
Chip Promises AI Performance in Games

Zonk posted more than 7 years ago | from the super-tech dept.

Heartless Gamer writes to mention an Ars Technica article about a dedicated processor for AI performance in games. The product, from a company called AIseek, seeks to do for NPC performance what the PhysX processor does for in-game physics. From the article: "AIseek will offer an SDK for developers that will enable their titles to take advantage of the Intia AI accelerator. According to the company, Intia works by accelerating low-level AI tasks up to 200 times compared to a CPU doing the work on its own. With the acceleration, NPCs will be better at tasks like terrain analysis, line-of-sight sensory simulation, path finding, and even simple movement. In fact, AIseek guarantees that with its coprocessor NPCs will always be able to find the optimal path in any title using the processor." Is this the 'way of the future' for PC titles? Will games powered by specific pieces of hardware become the norm?

252 comments

hm (2, Insightful)

JeanBaptiste (537955) | more than 7 years ago | (#16053990)

sounds like it just speeds up existing AI routines..... and existing AI routines, well, SUCK.

I don't think we are going to get any good AI until it has some method of "learning"

Re:hm (4, Funny)

Snarfangel (203258) | more than 7 years ago | (#16054052)

sounds like it just speeds up existing AI routines..... and existing AI routines, well, SUCK.

This will suck 200 times faster, though. That's like a straw compared to a fire hose.

Re:hm (5, Funny)

orasio (188021) | more than 7 years ago | (#16054157)

This will suck 200 times faster, though. That's like a straw compared to a fire hose.

Fire hoses don't suck. You need a more visual analogy.
Maybe something like this:

  "That's like a tick compared to your mother!"

Re:hm (2, Funny)

Terminal Saint (668751) | more than 7 years ago | (#16054610)

At least the tick shuts up occasionally...

Re:hm (1, Funny)

Anonymous Coward | more than 7 years ago | (#16054781)

"That's like a tick compared to your mother!"
I AM a tick you insensitive clod! (/spoon!)

Re:hm (4, Interesting)

MBCook (132727) | more than 7 years ago | (#16054091)

I read the blurb this morning. The idea is that it accelerates the basic operations that everything uses (line of sight, path finding, etc.). The more complex AI (actual behavior, planning, etc.) is built the normal way. It simply offloads work from the CPU and thus allows faster calculations.

The other real difference is that it is better than current algorythms. So instead of using A* for pathfinding, it works correctly even on dynamically changing terrain. This would mean things like no longer having NPCs getting stuck on rocks or logs or some such (*cough* half-life 1 *cough*).
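
For anyone who wants to see what one of these low-level primitives actually looks like, here is a minimal sketch of grid-based A* pathfinding in C++. It is illustrative only: the 4-connected grid, Manhattan heuristic and function names are assumptions, not AIseek's design, and a real engine would also return the path itself rather than just its length.

// Minimal grid A* -- a sketch of the kind of primitive an AI accelerator
// would offload, not AIseek's actual algorithm.
#include <climits>
#include <cstdlib>
#include <queue>
#include <vector>

struct Node { int x, y, g, f; };   // g = cost so far, f = g + heuristic

struct ByF { bool operator()(const Node& a, const Node& b) const { return a.f > b.f; } };

// Length of the shortest 4-connected path from (sx,sy) to (tx,ty) on a
// width*height grid; blocked[y*width+x] marks impassable cells.
// Returns -1 if no path exists.
int shortestPath(const std::vector<char>& blocked, int width, int height,
                 int sx, int sy, int tx, int ty)
{
    auto h = [&](int x, int y) { return std::abs(x - tx) + std::abs(y - ty); };
    std::vector<int> best(width * height, INT_MAX);
    std::priority_queue<Node, std::vector<Node>, ByF> open;
    open.push({sx, sy, 0, h(sx, sy)});
    best[sy * width + sx] = 0;
    const int dx[4] = {1, -1, 0, 0}, dy[4] = {0, 0, 1, -1};

    while (!open.empty()) {
        Node n = open.top(); open.pop();
        if (n.x == tx && n.y == ty) return n.g;        // heuristic never overestimates, so this is optimal
        if (n.g > best[n.y * width + n.x]) continue;   // stale queue entry
        for (int i = 0; i < 4; ++i) {
            int nx = n.x + dx[i], ny = n.y + dy[i];
            if (nx < 0 || ny < 0 || nx >= width || ny >= height) continue;
            if (blocked[ny * width + nx]) continue;
            int ng = n.g + 1;
            if (ng < best[ny * width + nx]) {
                best[ny * width + nx] = ng;
                open.push({nx, ny, ng, ng + h(nx, ny)});
            }
        }
    }
    return -1;
}

Note that this version recomputes from scratch on every call; the "dynamically changing terrain" claim implies something more like D*, which reuses previous search results when the map changes.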

Re:hm (1)

SnowZero (92219) | more than 7 years ago | (#16054439)

it is better than current algorythms [sic]

You do realize that A* is from 1968, and that things have improved a bit since then, right? It is better than the algorithms used in current video games, maybe, but that's as far as I'd take it.

hm-Modding "/.". (0)

Anonymous Coward | more than 7 years ago | (#16054165)

"sounds like it just speeds up existing AI routines..... and existing AI routines, well, SUCK."

So much for "chipping" slashdot posters.

Re:hm (2, Insightful)

Da Fokka (94074) | more than 7 years ago | (#16054217)

sounds like it just speeds up existing AI routines..... and existing AI routines, well, SUCK.


That's largely because most CPU cycles go to the pretty graphics. More computing power might help the AI in some games (although many AI routines are basically flawed anyway). This chip offers a more powerful tool to the AI programmer. It's still up to him to make an AI that's not totally stupid.

Al? (4, Funny)

gEvil (beta) (945888) | more than 7 years ago | (#16053993)

Who is Al and why do I want him controlling everything in my games?

Re:Al? (1)

creimer (824291) | more than 7 years ago | (#16054088)

If the almighty AI wasn't screwing up your game, life would be boring.

Separate box just for the gaming HW? (2, Interesting)

Zanth_ (157695) | more than 7 years ago | (#16053996)

What may occur is a separate box consisting of the GFX card, physics card, AI card, and a PSU for the above, along with supporting memory modules, just to power existing games. Multiple cards consisting of multiple chips with multiple cores will likely overwhelm the common case. Thus, for the hardcore gamer, a separate box wired to the main rig could be the norm. So for the average home user we will get smaller and smaller (Mac mini et al.), but for the gamer we'll see modular systems, with multiple boxes and multiple PSUs to help with cooling and overall performance goodness.

Re:Separate box just for the gaming HW? (5, Insightful)

gEvil (beta) (945888) | more than 7 years ago | (#16054056)

And then they can take everything and put it all in a big case with a monitor and speakers and a special panel with the controls on it. And then all you need to do is put a slot in the front that says $1.

Re:Separate box just for the gaming HW? (5, Insightful)

gregmac (629064) | more than 7 years ago | (#16054090)

What may occur is a separate box consisting of the GFX card, Physics Card, AI card, PSU for the above along with supporting memory modules just to power existing games.

what an [microsoft.com] interesting [nintendo.com] idea [playstation.com] .

Re:Separate box just for the gaming HW? (2, Interesting)

MBGMorden (803437) | more than 7 years ago | (#16054437)

Ok, link speak is annoying. Don't do that ;).

The systems you mention though are all scaled down computers. They don't really have any extra hardware that a standard computer doesn't have. The GP's comment seemed to be talking about putting all the "extra" hardware out of the case, which doesn't fit your model of just making a smaller and more focused computer.

Re:Separate box just for the gaming HW? (1)

Joe The Dragon (967727) | more than 7 years ago | (#16054722)

Something like this, with PCI-E and HyperTransport links; this also moves the heat from the video cards out of the case.
http://www.nvidia.com/page/quadroplex.html [nvidia.com]

What a fascinating idea! (0)

Anonymous Coward | more than 7 years ago | (#16054612)

You mean a separate box just for games? What would this mysterious "play station", look like, what would it cost? I'm having trouble imagining what this "game cube" or "entertainment system" would cost, and who would buy it. Who would write games for it? It's an unknown, or an "X box", if you will.

Nah, it will never happen. People like to use their hardware for word processing and spreadsheets, not shooting aliens. These aren't "game boys" we're talking about here.

Now if you'll excuse me, I have to go Wii.

Well.. (3, Funny)

geniusj (140174) | more than 7 years ago | (#16054003)

Well if Chip promises it, I believe him..

Not Gonna Work (4, Insightful)

TychoCelchuuu (835690) | more than 7 years ago | (#16054005)

The physics card could theoretically work because if the player doesn't have it, you could always leave out some of the eye candy and only calculate fancy physics for objects that affect gameplay. With an AI card, you don't have that luxury. Either the player has it, or you have to just dump all the AI (obviously not) or do it all on the CPU, which raises the question: why program your game for a dedicated AI card if you're just going to have to make it work on computers without one?

Re:Not Gonna Work (2, Interesting)

arthurh3535 (447288) | more than 7 years ago | (#16054033)

Actually, they could do something similar to what's done for graphics, just allowing for "weaker" AI routines that can work on a standard system.

It will be interesting to see if games are "more fun" with smarter AI, or if AI really isn't the big and important thing about making interesting games.

Re:Not Gonna Work (1)

kailoran (887304) | more than 7 years ago | (#16054062)

There's nothing really new to find out - when the game AI's so sucky that it has to seriously cheat to be an even opponent for the average player, then it's not very entertaining.

Re:Not Gonna Work (3, Insightful)

MBGMorden (803437) | more than 7 years ago | (#16054494)

The flip side, though, is that if most AIs could really think well, almost no human could compete with them in a game. You'd lose almost every time.

Re:Not Gonna Work (1)

CastrTroy (595695) | more than 7 years ago | (#16054814)

Although, when you're playing games like Go [wikipedia.org] it might be a good idea to have more intelligent AI, so that computers can beat the mediocre players.

Re:Not Gonna Work (1)

nschubach (922175) | more than 7 years ago | (#16055027)

Or you would figure out the AI on your AI chip and be able to beat every game you play using the same techniques...

Re:Not Gonna Work (1)

Hellken242 (897869) | more than 7 years ago | (#16054159)

The best AI you can get is playing people on a LAN or on the net, and most gamers seem to think that is quite fun. Even if some of those noobs are mentally challenged...

Re:Not Gonna Work (1)

Psiven (302490) | more than 7 years ago | (#16054061)

The real promise of PPUs, I believe, is when they can feed back to the CPU and affect gameplay. Eye candy is nice, and they'll have to be used only for that, at least for multiplayer games, until the hardware becomes mainstream. But honestly, PhysX has a long uphill battle with card-based PPUs when the likes of ATI, nVidia, Intel, and AMD all have their own ideas for physics co-processing.

Naah, not anytime soon (1, Redundant)

Gr8Apes (679165) | more than 7 years ago | (#16054119)

I think that AI and physics co-processors have a better future as part of the CPU, rather than an add-in board. Perhaps as additional core(s) on an AMD processor, with full access and feedback to the CPU proper?

Re:Naah, not anytime soon (2, Interesting)

legoburner (702695) | more than 7 years ago | (#16054194)

I am not so sure, as I think that some sort of programmable PCI-X card is going to exist sometime soonish which will allow programmable hardware processing of simple routines like line of sight or pathfinding (or other mathematical problems), and this will offload work from the CPU. This would be more logical, as it can then be used in many different applications, from custom rendering for render farms, to hardware-assisted protein folding, through to complicated firewalling/packet sniffing, and back to gaming and other desktop usage. They have been in development for a while now; I just wonder when they will start becoming available to the consumer.

Re:Naah, not anytime soon (1)

Hast (24833) | more than 7 years ago | (#16055036)

Yeah, they already do that. They're called multi-cores.

The biggest problem with this AI card (besides being vapourware) is that there is no way in hell you can beat CPUs at their own game. (Executing branchy code.)

I can see use for an AI library though. It doesn't seem like these guys are really clued in enough for that though.

Re:Not Gonna Work (0)

Anonymous Coward | more than 7 years ago | (#16054111)

To give better framerates to those who have an AI card.

Re:Not Gonna Work (1)

MBCook (132727) | more than 7 years ago | (#16054171)

But that only affects multiplayer. In single player, you can just use the dumber routines if they don't have the card. This especially applies to small creatures who aren't doing much more than pathfinding. In a GTA type game you can just put more people/cars on the street. There are circumstances where it would be perfectly possible.

The biggest problem is multiplayer where you basically have to have everyone require it or force everyone to use software.

Re:Not Gonna Work (1)

lawpoop (604919) | more than 7 years ago | (#16054173)

It sounds to me like the same problem with the physics card. You still have to code default behavior for objects if they don't have a physics card. You don't get default behavior for free, only errors and crashes.

So if the player doesn't have an AI card, you turn off some of the 'mind candy', and have stupider enemies.

Re:Not Gonna Work (4, Interesting)

Dr. Spork (142693) | more than 7 years ago | (#16054280)

I think this is the right question, but it may have an interesting answer. Maybe the way they picture future game AI is similar to present-day chess AI: Chess games evaluate many potential what-if scenarios several moves into the future, and select the one with the best outcome. Clearly, the more processing power is made available to them, the more intelligently they can play.

Maybe future RPG AI could have some similar routines regarding fight/flight decisions, fighting methods and maybe even dialogue. But that would require a pretty universal processor, which would really just argue for getting a second CPU. I don't have much hope of this catching on, but I'd welcome it. For one thing, writing AI that can run in a separate process from the rest of the game is something I'd love to see. I want something to keep that second core busy while I'm gaming!

Plus, it would be pretty cool for hardware manufacturers if AIs really got smarter with better hardware (be it CPU or add-in card). That would require big coding changes from the way AI is written now, but I do think those would be changes for the better.
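
To make the chess-style look-ahead concrete, here is a tiny depth-limited minimax sketch. The GameState type and its three methods are placeholders, not any real engine's API; the point is just that "deeper search = smarter play", which scales directly with available compute.

// Depth-limited minimax: evaluate what-if states a few moves ahead and pick
// the best outcome. GameState is a placeholder type, not a real engine API.
#include <algorithm>
#include <limits>
#include <vector>

struct GameState {
    double score = 0.0;                                       // placeholder static evaluation value
    std::vector<GameState> legalMoves() const { return {}; }  // placeholder move generator
    double evaluate() const { return score; }                 // score from the AI's point of view
    bool terminal() const { return legalMoves().empty(); }    // no moves left = game over here
};

double minimax(const GameState& s, int depth, bool aiToMove)
{
    if (depth == 0 || s.terminal())
        return s.evaluate();
    double best = aiToMove ? -std::numeric_limits<double>::infinity()
                           :  std::numeric_limits<double>::infinity();
    for (const GameState& next : s.legalMoves()) {
        double v = minimax(next, depth - 1, !aiToMove);
        best = aiToMove ? std::max(best, v) : std::min(best, v);
    }
    return best;
}

Each extra ply multiplies the work by the branching factor, which is exactly where a second core or a coprocessor would pay off.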

useful for game servers? (4, Interesting)

j1m+5n0w (749199) | more than 7 years ago | (#16054339)

why program your game for a dedicated AI card if you're just going to have to make it work on computers without one?

Perhaps the card could be most useful not on the client, but in dedicated mmorpg servers. I know WoW could definitely use some smarter mobiles. Sometimes I think whoever designed the AI was inspired by the green turtles from Super Mario 1. I'd like to see games with smarter mobs and NPCs, and any game with a realistic ecology (for instance, suppose mobs don't magically spawn, they procreate the old fashioned way, and must eat food (a limited resource) to survive) would require many more mobs than a WoW-like game in order to prevent players from destroying the environment. Simulating millions of intelligent mobs would likely be very expensive computationally.

Stupid question... (3, Interesting)

UbuntuDupe (970646) | more than 7 years ago | (#16054463)

...that I've always wanted the answer to from someone who knows what they're talking about:

For the application you've described, and similar ones, people always claim it would be cool to be able to handle massive data processing so you could have lots of AIs, and that would get realistic results. However, it seems that with *that many* in-game entities, you could have gotten essentially the same results with a cheap random generator with statistical modifiers. How is a user going to be able to discern "there are lots of Species X here because they 'observed' the plentiful food and came and reproduced" from "there are lots of Species X here because the random generator applied a greater multiple due to more favorable conditions"?

I saw this in the game Republic: the Revolution (or was it Revolution: the Republic?). It bragged about having lots and lots of AI's in it, but in the game, voter support in each district appeared *as if* it were determined by the inputs that are supposed to affect it, with a little randomness thrown in. The AI's just seemed to eat up cycles.

Long story short, aren't the emergent results of a large number of individual AIs essentially the same as what you would get from statistical random generation?
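
A toy example of the comparison (the birth/death rates and tick count are made up; no real game is being modeled): for a large, homogeneous population, the per-agent loop and the one-line statistical update land on nearly the same number, which is the point of the question above.

// Toy comparison: simulating every creature individually vs. updating an
// aggregate count statistically. Rates are invented for illustration.
#include <cstdio>
#include <random>

int main()
{
    std::mt19937 rng(42);
    const double birthRate = 0.10, deathRate = 0.06;   // assumed per-tick probabilities

    // Agent-based: flip a coin for every individual, every tick.
    int agents = 10000;
    std::bernoulli_distribution birth(birthRate), death(deathRate);
    for (int tick = 0; tick < 50; ++tick) {
        int born = 0, died = 0;
        for (int i = 0; i < agents; ++i) {
            if (birth(rng)) ++born;
            if (death(rng)) ++died;
        }
        agents += born - died;
    }

    // Statistical: apply the expected growth factor to one number.
    double population = 10000.0;
    for (int tick = 0; tick < 50; ++tick)
        population *= 1.0 + birthRate - deathRate;

    std::printf("agent-based: %d   statistical: %.0f\n", agents, population);
    return 0;
}

The two diverge only when individuals interact locally (limited food, predators, terrain), which is where per-agent AI starts producing results a statistical shortcut cannot.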

Might work for MMORPGs (1)

Lonewolf666 (259450) | more than 7 years ago | (#16054388)

So far, those seem really short of AI. Maybe because they have less computing power per player (that server farm must be affordable). With dedicated AI cards for the servers, MMORPGs might be able to catch up to newer single player games that have at least half-decent AI.

Easy (1)

cgenman (325138) | more than 7 years ago | (#16054926)

On games I've worked on in the past, we had a global strategizing algorithm that ran once every few seconds (over the course of a bunch of frames), more localized map sectional AI that ran slightly more frequently, per-unit pathfinding that ran (incompletely) every second, and moment-to-moment movement that ran every frame.

Now, if we could run all of those AI routines every frame, the game would appear a bit smarter. It wouldn't have a delay in reacting to stimuli, the pathfinding could run a character intelligently across the map without bumping into dead ends, new units would path immediately instead of waiting for the next global strategy cycle, etc.

Not a major update, but a perfectly scalable one.
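
A rough sketch of the kind of staggered update loop described above; the function names, intervals and 60 Hz frame rate are placeholders, not the poster's actual engine.

// Staggered AI updates: cheap work every frame, expensive work spread over
// longer intervals. Names and intervals are illustrative assumptions.
#include <cstdio>

void updateMovement()        { /* per-frame steering/animation */ }
void updateUnitPathfinding() { /* incremental per-unit path search */ }
void updateSectorAI()        { /* localized map-sector decisions */ }
void updateGlobalStrategy()  { /* expensive whole-map planning */ }

int main()
{
    const double frameDt = 1.0 / 60.0;              // assume a 60 Hz frame loop
    double timePath = 0, timeSector = 0, timeGlobal = 0;

    for (int frame = 0; frame < 600; ++frame) {     // ~10 simulated seconds
        updateMovement();                           // every frame

        timePath += frameDt; timeSector += frameDt; timeGlobal += frameDt;
        if (timePath   >= 1.0) { updateUnitPathfinding(); timePath   = 0; }
        if (timeSector >= 2.0) { updateSectorAI();        timeSector = 0; }
        if (timeGlobal >= 5.0) { updateGlobalStrategy();  timeGlobal = 0; }
    }
    std::printf("done\n");
    return 0;
}

Extra compute, whether a second core or a coprocessor, simply lets the longer intervals shrink toward once per frame.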

Re:Not Gonna Work (1)

Rifter13 (773076) | more than 7 years ago | (#16055008)

Yeah, and all games coming out today will run in software renderers. Even if it is not really plausible today... tomorrow there may be enough penetration to make it like video cards, where you have to have a high-end one to play games.

6th Post (-1, Offtopic)

Anonymous Coward | more than 7 years ago | (#16054022)

EIGHTH POST!!!

Client-side Multiplayer AI (4, Insightful)

w33t (978574) | more than 7 years ago | (#16054028)

Something that's always bugged me a bit about expansion boards is that the experience can only be enjoyed by the user with the board.

For instance, in a multiplayer game, some players will obviously be getting better graphics than the rest - but often the maps are tailored to work equally well (or at least as equally as possible) to low-end and high-end video cards.

And then there is this new PhysX card, which sounds like a neat idea, but you have the same kind of situation. You can make the physics look a bit better for the player with the card, but all actual physical actions must be reproducible for the players without the card.

Now, here is where I think the AI card could be different: distributed processing.

Let's take two human players and 4 AI players in a multiplayer game. Normally the server would be responsible for the AI decision-making processing and would pass to the clients only the x,y,z movement and animation data as a network stream. The AI thinking would take place completely free of the client machines. This puts strain on the server's resources.

Now, imagine that rather than the server processing and the clients receiving network info, you were to turn this on its head.

Have the clients process a subset of the AI - say, 2 AI for player 1's machine, and 2 AI for player 2's machine. Now both clients will send the AI's movement information to the server. From the server's point of view the AI would require the same processing power that a regular human player would require (very little - relatively speaking).

With the plethora of bandwidth available client-side these days I think this kind of idea is very realistic.

Re:Client-side Multiplayer AI (1)

Grr (15821) | more than 7 years ago | (#16054163)

There are some types of games where this might work; I think Guild Wars uses this for the henchmen, if not for large parts of the instance. But for many games which are strictly competitive, client-side tampering, such as making the AI cooperative or making knowledge that only the AI should have available to the local player, is a big risk.

Re:Client-side Multiplayer AI (1)

w33t (978574) | more than 7 years ago | (#16054407)

That is a good point.

I wonder though, if you could create AI who have only the same information a player would have (so to speak).

Is it conceivable to create a method where the AI is presented with a view of the game that would closely match the level that a player would experience?

For instance, in a typical game the AI is not so much a distinct entity as just a grouping of functions and variables with certain access rights to other objects within the program. This makes current AI just another aspect of the game.

But imagine if you could create a presentation layer of the game and then have the AI run as a separate process altogether, rather than simply a thread in the game. This presentation level would not so much give the AI access to the game's variables as present a 'view' of the game world. It would not so much be an API as an AID (AI Interface Device), perhaps.

It's a fun idea. I admit there are many problems, but none truly insurmountable I would think.

Re:Client-side Multiplayer AI (1)

Scoth (879800) | more than 7 years ago | (#16054578)

Wrong issue, though. It's not so much an issue of "cheating" AI that are given information a player wouldn't have, but of a player using the AI info on their computer in ways they shouldn't. For example, in a game that has map exploration, like an RTS game, imagine being able to tap into the AI's viewport and gaining all the exploration knowledge it has. Or in an FPS with radar showing enemy locations, you could tap into the AI to use its radar too if it's supposed to be a friendly AI. Or, if the AI is on an enemy team, its info might well have the locations of all the enemies on the team.

This is all data that any legit player would have, but it can be dangerous to have this info sent to clients that aren't that player. The classic Aimbot is what can happen when you trust client data more than you should, not to mention wallhacks and similar. It would be conceivable that it could be encrypted or tagged somehow, but we all know how well that sort of thing works.

Re:Client-side Multiplayer AI (2, Insightful)

Anonymous Coward | more than 7 years ago | (#16054196)

Having worked in the industry on a MMORPG, I agree that it would be nice to use client machines for extra distributed processing, but there are issues.

First, as a rule of thumb in multiplayer development, never trust the client machines for anything other than controller and view data for that player. The client machine is hackable, unlike (supposedly) your server. They can wrap .dll's so that they can modify and view data, in your case data that they may not legitimately have access to, such as "what's behind that wall", and can inject "smart code" of their own to limit the updates the game sends back to the server. Don't trust your players.

Second, although bandwidth is going up, having a remote machine control AIs in-game introduces two issues (off the top of my head). First, if the client machine hosting the AI goes down or loses its connection to the server, the AI will become "brain dead" to all other players in that general area, at least until a new AI controller can be spawned somewhere. As well, that AI has then lost its memory of everything it has experienced up to that point. Second, although bandwidth has increased, latency is still your issue. No longer are state updates for your NPC coming only from the server, they are coming from some remote machine, hitting the server, being validated (if the server is smart at all) and then being propagated to all other players in that area. That introduces a validation check and a second hop, which will slow down AI responsiveness.
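
As a concrete illustration of that validation step, here is a minimal sketch of a server-side sanity check on a client-reported NPC update. The structs and the speed limit are invented for the example; a real server would check far more than movement speed.

// Server-side validation of a client-reported NPC position update.
// The types and the speed limit are illustrative assumptions, not a real protocol.
#include <cmath>

struct Vec3 { float x, y, z; };

struct NpcUpdate {
    int   npcId;
    Vec3  newPos;
    float dtSeconds;     // time since the client's last accepted update
};

// Reject any update that would require the NPC to move faster than it can.
bool acceptUpdate(const Vec3& lastAcceptedPos, const NpcUpdate& u, float maxSpeed)
{
    float dx = u.newPos.x - lastAcceptedPos.x;
    float dy = u.newPos.y - lastAcceptedPos.y;
    float dz = u.newPos.z - lastAcceptedPos.z;
    float dist = std::sqrt(dx * dx + dy * dy + dz * dz);
    return u.dtSeconds > 0.0f && dist <= maxSpeed * u.dtSeconds;
}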

We thought about this a fair bit where I worked. We decided that it just wasn't doable.

Nice dream, though. :-)

Re:Client-side Multiplayer AI (1)

SnowZero (92219) | more than 7 years ago | (#16054822)

We thought about this a fair bit where I worked. We decided that it just wasn't doable.

One thing I could imagine doing for a game is offloading just the path planning. You could make it a fairly dynamic thing where no particular client has responsibility for an NPC. Each client would receive a source and target position for some NPCs, and the client could plan paths and send back a nearby waypoint for each NPC that will take it partway to its goal. You could assign multiple client computers the same problem and pick from the returned results randomly. This would only work for fairly static worlds, but AFAIK most MMORPGs don't allow dynamic obstacles (one reason for not having players be obstacles).
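
A sketch of that redundant-assignment idea, with hypothetical types: several clients answer the same (source, target) query and the server accepts one reply at random, which limits how much any single tampered client can steer an NPC.

// Redundant client-side path planning: several clients answer the same query
// and the server picks one reply at random. Types are illustrative only.
#include <optional>
#include <random>
#include <vector>

struct Waypoint { float x, y; int clientId; };

// Replies gathered for one NPC's (source, target) query this tick.
std::optional<Waypoint> pickWaypoint(const std::vector<Waypoint>& replies, std::mt19937& rng)
{
    if (replies.empty())
        return std::nullopt;                       // fall back to server-side pathing
    std::uniform_int_distribution<size_t> pick(0, replies.size() - 1);
    return replies[pick(rng)];                     // random choice limits any one cheater's influence
}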

Re:Client-side Multiplayer AI (1)

ZachPruckowski (918562) | more than 7 years ago | (#16054225)

That makes it a lot easier to cheat and make the AI you run either super-smart or braindead. If you can do that, you just solo and wipe the floor with brain-dead AIs until you have enough treasure/levels/skillz to take on anyone.

It might take awhile.... (5, Funny)

Anonymous Coward | more than 7 years ago | (#16054050)

In fact, AIseek guarantees that with its coprocessor NPCs will always be able to find the optimal path in any title using the processor.

Aren't many problems of that ilk NP-complete?

Decrease the constant factor (1)

tepples (727027) | more than 7 years ago | (#16054324)

True, finding the optimal path is often NP-complete, but if the AI card decreases the constant factor in the majority of cases, this could still be a win.

Re:Decrease the constant factor (1)

26199 (577806) | more than 7 years ago | (#16055077)

Hmm. Path finding isn't NP complete -- polynomial, at worst. So it's quite solvable.

It's only NP complete if you have some weird requirement like 'visit all cities' as in the Travelling Salesperson Problem.

And anyway in general if a problem is NP complete then people can't do it either, so you don't need a full solution...

Yet another waste, years late (4, Insightful)

MaineCoon (12585) | more than 7 years ago | (#16054058)

The product, from a company called AIseek, seeks to do for NPC performance what the PhysX processor does for in-game physics.


They want to completely ruin game performance by killing the PCI bus bandwidth and causing the GPU to stall waiting on the position/orientation and generated geometry that it will have to render?

Physics and AI coprocessors are 2 years too late - with the increasing availability of dual core processors in even midrange consumer systems now, and quad core on the horizon, engineering time is much better spent on making an app multithreaded so that it runs efficiently on hyperthreaded and dual core machines, instead of trying to offload it to a coprocessor that few customers will have. For a consumer, it is a better investment to spend an extra $50 to $100 for a dual core processor than spend $300 on a physics or AI coprocessor.

I doubt, and openly mock, their claims of '200x' speedup. I imagine it will be more like speeding up the process of $200 leaving foolish consumers' wallets.

ya, SMP anyone? (2, Insightful)

snarkasaurus (627205) | more than 7 years ago | (#16054309)

I agree. Intel just released dual core chips, AMD has them already and is about to release quad core chips, plus we have -cheap- dual processor boards available. That'd be eight cores, as soon as AMD releases their new kit.

Even Windows is shipping with SMP available, we have processing capability out the wazoo pretty much. Should be able to handle any AI requirements I'd think and have room to balance your checkbook at the same time.

Some clever lad should be able to design a bot that doesn't do the same thing every single time, eh? Maybe learns to check that blind spot before it sticks its head out. Now THAT would be fun! Better than new eye candy, for sure.

Completely off base (2, Informative)

everphilski (877346) | more than 7 years ago | (#16054329)

They want to completely ruin game performance by killing the PCI bus bandwidth

Positional updates to a character in the game are very low bandwidth - I mean, MMOs do this all the time and don't saturate network connections, much less a PCI bus. The calculations are heavy, but the input and end result are just a few numbers, plus a terrain map you would load once and forget until you zone, at which time a little latency is happening anyway.

causing the GPU to stall waiting on the position/orientation and generated geometry that it will have to render?

Read carefully. It isn't generating terrain, just sending around updates. Diffs between 2 meshes don't have to be big. The mesh will probably stay the same, just relocate. Send an array of updated points with the corresponding indices. It isn't hard to imagine that a dedicated processor could do these things significantly faster than a processor that is already breaking its behind doing thousands of matrix transformations, player calculations, sound and graphics effects, etc.
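
To put rough numbers on this, a sketch of what such updates might look like; the struct layouts are assumptions for illustration, not any real wire format.

// Rough size of a per-NPC positional update and a sparse mesh delta.
// The layouts are invented for illustration; no real protocol is implied.
#include <cstdint>
#include <cstdio>
#include <vector>

struct PoseUpdate {            // one NPC: position + orientation quaternion
    uint32_t npcId;
    float    pos[3];
    float    quat[4];
};                             // 32 bytes on typical platforms

struct VertexDelta {           // one displaced vertex of an otherwise unchanged mesh
    uint32_t index;
    float    pos[3];
};                             // 16 bytes on typical platforms

int main()
{
    std::vector<PoseUpdate>  npcs(200);       // 200 NPCs updated this tick
    std::vector<VertexDelta> meshDiff(500);   // 500 vertices moved

    std::printf("pose updates: %zu bytes, mesh delta: %zu bytes\n",
                npcs.size() * sizeof(PoseUpdate),
                meshDiff.size() * sizeof(VertexDelta));
    return 0;
}

A few kilobytes per tick is trivial next to the geometry and texture traffic a GPU already handles over the same bus.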

Re:Yet another waste, years late (3, Informative)

j00r0m4nc3r (959816) | more than 7 years ago | (#16054368)

I fully believe their claim is totally realistic. With a dedicated circuit to process A* or Dijkstra's algorithm (or solve generic network traversal problems) you could very easily beat a general-purpose processor by 200x. While computer CPUs are very good at doing a lot of different things, they generally suck at doing specific things extremely fast. A dedicated DSP chip, for example, can easily outperform a general-purpose processor running a DSP subroutine by 200x. If they can make these things cheap enough that they can start getting integrated into video cards or whatnot without affecting the price too much, I think they may have a chance.

Re:Yet another waste, years late (2, Interesting)

Targon (17348) | more than 7 years ago | (#16054502)

PCI is on its way out; PCI Express is the next stage, or HTX (HyperTransport slot).

Dedicated co-processors are a good idea; the problem is the costs involved. AMD is pushing for these companies to just make their new co-processors work with an existing socket type, so instead of trying to sell cards (which cost more money to make due to the PCB), we will buy just the chip itself.

To be honest, this is a better way to go since if a GPU were implemented in this way, you could easily just buy a GPU, toss it on the motherboard, and bingo, easy upgrade without the expense of buying a new graphics card and memory. Sure, you might see generational jumps in the memory used for the graphics and how it connects to the GPU, but that problem could be solved in multiple ways.

Re:Yet another waste, years late (1)

grumpygrodyguy (603716) | more than 7 years ago | (#16054564)

Physics and AI coprocessors are 2 years too late - with the increasing availability of dual core processors in even midrange consumer systems now, and quad core on the horizon, engineering time is much better spent on making an app multithreaded so that it runs efficiently on hyperthreaded and dual core machines

I agree that multithreaded game engines are probably the wave of the future, but I would still love to see these physics and AI co-processors integrated onto video cards. Expecting gamers to pay $200 for a PCIe physics card, and another $200 on a PCIe AI card is obviously not going to fly.

Solution:

1) Don't make a separate card
2) License these chips to nVidia or ATI so they can place the chips directly onto their video cards. Then bundle the drivers with the graphics driver download, and work with Microsoft to ensure that the next version of DirectX supports these new features.
3) If that's not feasible (video card is too bulky, not enough bandwidth through the single PCIe16 or PCIe32 slot, etc.), then integrate the physics and AI chips onto motherboards specifically designed for gamers.
4) As a last resort, get the physics and AI guys together so they can release a single card that's reasonably priced...and make sure they work closely with Microsoft for DirectX support.

We need GPUs because graphics processing is highly parallel, something which is terribly inefficient on general-purpose CPUs. In the same way, if physics and AI processing can benefit dramatically from a specialized architecture, then it makes sense to build these chips. Even a high-end quad-core CPU can't compete with a 32-pipeline graphics card.

Re:Yet another waste, years late (1)

k_187 (61692) | more than 7 years ago | (#16054682)

Well, there will always be people that will buy crap so they can be the 1337est. I'd say that getting MS to put this stuff into DirectX and then getting developers to use it will spur enough adoption of the cards to bring prices down. It wasn't that long ago that GPUs weren't needed. It took Quake and Voodoo together to get people to realize what the difference could be. What are these people doing to illustrate that difference?

Wrong (0)

Anonymous Coward | more than 7 years ago | (#16054835)

You've done a very good job at being condescending so people mod you 'insightful', but you don't really know what you are talking about.

There is no way that a generic CPU, or even four of them, could compete with a custom piece of hardware. If you don't believe this, try running any first-person shooter on a dual-core machine with your graphics card drivers uninstalled. It will look like a piece of crap.

In my digital design laboratory we made a video processing unit on a 100 MHz FPGA and compared its performance to a 650 MHz CPU. The FPGA was hundreds of times faster, and we weren't even doing anything clever. These custom AI/Physics cards have many cores with parallel computing power and insane memory bandwidth so they can pump many, many more computations through than a generic CPU.

Of course, the 'average user' would get more mileage out of an extra core instead of this AI card. But they were never intending to market this to the average user. Personally, I think this will have a very hard time catching on for all the reasons other people have already stated.

Re:Yet another waste, years late (1)

Das Modell (969371) | more than 7 years ago | (#16055169)

I can just imagine the future: two GPUs, a PPU and an AIPU all running with coolers, along with the CPU (quad core, obviously) and a few hard disks and optical drives. We'll need some sort of alien technology to keep those systems running.

Hopefully it won't come to that.

AI and Phys (1)

Broken scope (973885) | more than 7 years ago | (#16054072)

I find this interesting, but I have to agree with the above posters. My line of thought is that when it comes to multiplayer games, the server needs to be doing things like physics calculations. The server should also do AI. I wonder when we will see the first MMO with a realistic physics model. I want to see an MMO that starts up AI to go about the universe when there are very few players on, or to help the game be fun when it is starting out and there aren't that many people on.

Re:AI and Phys (1)

MaineCoon (12585) | more than 7 years ago | (#16054186)

Clients have to do physics as well, for client-side prediction.

Unless we want to go back to the "good old days" of Quake 1, with jerky networking and opponents who appeared to pop all over the place when the network got congested...

Specific hardware (1)

Rob T Firefly (844560) | more than 7 years ago | (#16054074)

Will games powered by specific pieces of hardware become the norm?
Many games have been over the past three decades or so. They're known as console games.

If things continue in this direction, it looks like we may be buying game consoles to hook to our computers instead of our televisions.

Lockout chip business model (1)

tepples (727027) | more than 7 years ago | (#16054396)

it looks like we may be buying game consoles to hook to our computers instead of our televisions.

Will video game consoles for computers come with the same systematic bias against smaller game developers that video game consoles for televisions have traditionally come with?

No (0)

MooseMuffin (799896) | more than 7 years ago | (#16054081)

The AI card won't work for the same reason the physics card won't work. I'm not buying another card. PC gamers already need to spend an extra $200-500 for a graphics card in addition to the rest of the computer. I'm fine with that, but like hell if I'm buying 2 other cards for other features. The only chance these things have is if they get it all onto one card.

Re:No (1)

fbjon (692006) | more than 7 years ago | (#16054960)

I predict that high-end graphics cards will soon turn into "gaming cards".

Re:No (0)

Anonymous Coward | more than 7 years ago | (#16055053)

Maybe this allows you to spend money on what you want to have from the game... You like nicer dungeons? Buy the ultraexpensive graphics card. You think that smarter enemies are the way to go? Take the AI card.

Hopefully it doesn't lead to something like "Please insert another AI card to play on expert level". But on the other hand I remember that Magic Carpet needed 16MB of RAM in order to run in high resolution mode (640x480) and that cost maybe the same as an AI card nowadays ;-)

Weak Praise (1)

nick_davison (217681) | more than 7 years ago | (#16054085)

The product, from a company called AIseek, seeks to do for NPC performance what the PhysX processor does for in-game physics.

Damn. And I hoped it'd actually be useful for AI.

The problem with PhysX is that it costs about as much as a mid-range graphics card and yet adds the kind of performance gains of a first-gen graphics card. Whilst GPUs are massively evolved compared to first-gen offerings, PhysX in its first-gen state is a really expensive nice little add-on.

Don't get me wrong, when physics processors and AI processors make the kind of difference that having or not having a GPU makes now, that'll be an amazing thing. It's just that, in first-gen form, with no competition yet pushing the market forward, PhysX has been met with a deafening yawn, and saying this chip hopes to do the same doesn't really promise much.

Memories (1)

Artana Niveus Corvum (460604) | more than 7 years ago | (#16054087)

Seems to me that I recall these exact questions being asked about 3D accelerator cards a number of years ago. Why program for it if you're going to have to make it work without it anyway? For a few games, they simply made the game not run if you didn't have the particular piece of hardware (this memory is tied directly to a version of a BattleTech-based 3D game that relied on you having a Matrox Mystique card). I also read about other things which relied on various Voodoo cards or... what was it... eh, it was another discrete 3D card that didn't even interface with the normal graphics card. I had one at one point. It was neat when it worked, and it sort-of supported Glide, but with interesting distortions and brokenness...

What ever happened to 3dfx? (1)

Travoltus (110240) | more than 7 years ago | (#16054367)

I've been using nvidia forever and a day since I switched out from the Matrox Parhelia.

When is Matrox coming out with something new?

Re:What ever happened to 3dfx? (0)

Anonymous Coward | more than 7 years ago | (#16055039)

They're launching a new card along with the release of Duke Nukem Forever... although it looks like 3DRealms might be ready before them.

Singing, dancing, GUI's (0)

Anonymous Coward | more than 7 years ago | (#16054099)

"Heartless Gamer writes to mention an Ars Technica article about a dedicated processor for AI performance in games. The product, from a company called AIseek, seeks to do for NPC performance what the PhysX processor does for in-game physics."

Or both can be used to speed up Vista.

Re:Singing, dancing, GUI's (1)

zarthrag (650912) | more than 7 years ago | (#16054982)

Vista has physics??? *duck*

Why wouldn't we just use extra cores? (3, Insightful)

Rhys (96510) | more than 7 years ago | (#16054101)

Since the MHz jumps of the past seem to be by and large behind us these days and we're looking at more and more cores instead, isn't it time that games become multithreaded and offload that nasty pathing work to a second core? Sure, you could buy stupid shiny cards for the game physics and AI and network (some sort of network booster that avoids the OS's TCP stack -- posted a while back I believe), or alternately just make use of the extra hardware that /will/ be in the box anyway.

Now, the decent AI toolkit that folks can license might be worth it anyway, when they figure out they should just run it on the CPU instead of their custom CPU-like-thing.
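
A minimal sketch of that second-core offload using plain standard threading; findPath here is a stand-in for whatever search the engine actually uses.

// Offloading pathfinding to another core with std::async.
// findPath() is a placeholder for the game's real search routine.
#include <future>
#include <utility>
#include <vector>

using Point = std::pair<int, int>;

std::vector<Point> findPath(Point from, Point to)
{
    // ... expensive search runs here, off the main thread ...
    return {from, to};                      // placeholder result
}

int main()
{
    Point npcPos{0, 0}, goal{40, 25};

    // Kick off the search; the render/game loop keeps running meanwhile.
    std::future<std::vector<Point>> pending =
        std::async(std::launch::async, findPath, npcPos, goal);

    // ... simulate and render frames here ...

    // Pick up the result when it is ready (blocking only if it isn't yet).
    std::vector<Point> path = pending.get();
    return static_cast<int>(path.size());
}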

Re:Why wouldn't we just use extra cores? (0)

Anonymous Coward | more than 7 years ago | (#16054905)

Because custom hardware is much faster than your CPU. Compare GPU for graphics vs. CPU for graphics.

You might argue that they are doing it on the CPU already, so then more CPUs = more AI, but at some point this stops working and CPUs aren't fast enough to do all the computation. Just like how real-time graphics didn't really take off until video cards were popular.

The real benefit of these cards isn't to speed up the AI we have today in games, but to allow developers to use algorithms that they can't use right now. As other people have already stated, during the adoption period this must be optional so that the game can work on lots of hardware, so maybe the current MMO market prohibits this. But wouldn't it be awesome if you could play an MMO in a massive hundreds-vs-hundreds battle with intelligent AI that actually made good decisions and wasn't super easy to exploit? I think it would be rad.

Welcome to the Dot Bomb v2.0 (2, Insightful)

Bieeanda (961632) | more than 7 years ago | (#16054124)

One might think that the future is piecemeal, given the PhysX card, this thing and the even more ridiculous Killer(TM) NIC, but there are a few small things that the would-be bandwagoneers developing these things don't want to think about.

The first is money. A serious gamer who likes his bells and whistles might be expected to spend several hundred dollars every year or two in order to make his games run at their prettiest and fastest. He still has a finite budget, though; asking him to spend a similar amount on physics and AI hardware is unlikely to have the desired effect.

The second is developer support. Developers are stuck in an even bigger pickle: on the one hand, these devices (ideally and theoretically) provide new avenues for gameplay, but the moment that the hardware becomes necessary, they've eliminated a definite percentage of their market.

Three... are these things necessary, or even desirable? The original PhysX demo application, intended to show the effectiveness of the hardware by flinging crates around, ran perfectly smoothly on good hardware once hacked to remove the check for the PhysX processor. The Killer(TM) NIC is pretty much marketing snake-oil to anyone with any knowledge of networking. The 'need' for an AI coprocessor is pretty much obviated by faster main processors. Most games these days haven't been optimized for the multi-core processors that, unlike parlor tricks like PhysX, are actually growing in popularity. Wouldn't it be just that much easier for a developer to assign AI routines and meaningful physics interactions to idle processor cores, rather than constantly shuffling vital data back and forth between peripheral cards?

$279 (0)

Anonymous Coward | more than 7 years ago | (#16054513)

I didn't know what the Killer(TM) NIC was, so I looked it up. It's $279 for a NIC. That's all anyone need know. They throw some buzz words out there, but remember, $279. It's a space heater/NIC that takes up a PCI slot and costs $279. End of story.

$279

Everything old is new again. (1)

TheWoozle (984500) | more than 7 years ago | (#16054127)

As always, computer "innovation" goes in circles. Everything goes from dedicated hardware to software running on a general-use CPU, and back again. Terminals are replaced by PCs, which become hosts for remote interfaces to applications running on servers somewhere else.

It seems like everything I can do now I could do in 1990, but today I can do it *in hi-definition, wirelessly*. Yippee.

This is better for consoles (1)

Travoltus (110240) | more than 7 years ago | (#16054137)

Slap this into the Xbox 720 or PS3/4 and you get a mondo increase in NPC performance, as long as the developers put in some rudimentary "learning" routines to keep things random. All gamers buying games that want that NPC chip get to enjoy the fun. Not so for the PC gamers. For PC gamers, game companies that make games for the most elite configurations, namely those requiring the PhysX processor and this one, will have a lower percentage of sales per owners of PCs.

Re:This is better for consoles (1)

ThomasBHardy (827616) | more than 7 years ago | (#16054496)

Now all you have to do is define the standard AI interface for all games so that you could build an optimized chip for it. No problem, right? I'm sure the AI needs of Quake, Need for Speed, GTA, World of Warcraft, Lumens and Poke Mon are all exactly alike.

Re:This is better for consoles (1)

cowscows (103644) | more than 7 years ago | (#16054645)

I'd imagine that for a lot of things, there are more similarities than you think. Stuff that has already been mentioned, like line of sight or path finding. One thing that most games nowadays have in common is that they're basically simulating three-dimensional spaces. What happens within those spaces changes, but there are certain qualities of those spaces that are pretty much always the same. Some objects are in front of others. Some objects are too far away to be seen. Most objects in the world cannot be passed through. And so on. Whether they're shooting aliens or driving a car trying to chase down a criminal, most NPCs in today's games are inhabiting a simulated reality with rules pretty similar to real life.

Re:This is better for consoles (1)

ThomasBHardy (827616) | more than 7 years ago | (#16054872)

I think there are a lot of assumptions about the commonality of the framework of the game engine. The methodology of determining something like line of sight is a very different exercise in, say, World of Warcraft, where everything is positioned in a 3D-modeled environment, than in a 2D RTS game, which is not creating a 3D-modeled world and instead examines map data to determine line of sight within its own framework.

The sheer essence of games is that any given game does things its own way in order to achieve its own goals. This gives us diversity, progress and competition. In order to use canned AI routines, you would need to construct your games around the commonality that the AI routines can accept. I think it would lower diversification.

The better question than "Can a card enhance AI?" is "Why not use the largely underused second core in the soon-to-be-ubiquitous dual-core PCs instead?"

If they think they can make an AI card, they should start by making an AI library, convincing a statistically non-trivial portion of the game development community that it works, and seeing some games released with it. Then you can spend the money to try and make cards that optimize the libraries. Any card they release before they even have a market will have the same lackluster reception that Ageia has had.

Re:This is better for consoles (1)

Fulcrum of Evil (560260) | more than 7 years ago | (#16055000)

Meh, if this became popular, it'd just be another required card for a gamer PC, much like the 3d accelerators.

MMO Servers? (1)

Kaenneth (82978) | more than 7 years ago | (#16054156)

I could see something like this used to lower the costs, and increase the scale of games like Everquest/World of Warcraft. Those games have dedicated server machines running AI's 24/7, for profit.

Re:MMO Servers? (1)

ThomasBHardy (827616) | more than 7 years ago | (#16054473)

Ok so Blizzard buys a few hundred cards... Sony buys another few hundred... NCSoft buys a few more hundred...

Now all you have to do is find someone to buy another 1.3 million to make them profitable for all the design work and manufacturing costs that went into them, and you are golden ;)

way of the present (2, Interesting)

krotkruton (967718) | more than 7 years ago | (#16054176)

Is this the 'way of the future' for PC titles? Will games powered by specific pieces of hardware become the norm?

Short-run, maybe; long-run, no. IMHO, things will consolidate like they always seem to do. Video cards are necessary for more than just games, so they won't be going anywhere. Physics and AI cards seem to be useful for nothing but games. It would be foolish to combine all video cards with physics and AI chips because not everyone plays games, but why not combine the physics and AI chips? Farther down the road someone will come out with a new card to enhance some other aspect of gameplay, and eventually that will merge with the physics and AI chips on their own card.

Things are always being consolidated on PCs. Look at all the things on mobos that used to require separate cards 10 years or even 5 years ago. Designers get better and better at cramming more things into a smaller space (even if that is getting harder and harder to do), so it seems to me that these things will keep merging together when it is useful to do so. In this case, I don't think most PC users want to have 3-5 cards just for games, so it is useful. I could be completely wrong on that point though.

Who cares? (1)

Threni (635302) | more than 7 years ago | (#16054193)

> Is this the 'way of the future' for PC titles?

I mean, I care, but best-selling games have always been about licenses, tie-ins and snazzy graphics. If there's one thing I've learnt from playing (and to a lesser extent, writing) computer games, it's that nobody (statistically speaking) cares about gameplay and AI.

Besides, the best games I've played recently - i.e. the Battlefield series of games from EA - don't even use AI (unless you're in billy-no-mates single-player mode).

Physics + AI + Graphics = Game Card? (5, Interesting)

bockelboy (824282) | more than 7 years ago | (#16054198)

One begins to wonder what the "endgame" scenario is for the respective manufacturers of the physics and AI cards we're seeing. I can foresee three distinct situations:

1) The CEOs, investors, and engineers are complete idiots, and expect all the gamers of the world to buy separate physics, AI, and graphics cards
2) They're hoping to provide chips to ATI or nVidia for a "game card" instead of a "graphics card", the next generation of expensive purchases for gamers
3) They're hoping to provide chips for the nextgen xbox / playstation / wii, hoping that their chips will be the ones to make gaming interesting again.

Need... bigger... motherboard (0)

Anonymous Coward | more than 7 years ago | (#16054211)

How am I supposed to pretend to have quad SLI, PhysX, this AI card, Soundblaster X-Fi, and the $279 'killer' NIC all at once, huh? It's just not fair!

For all intents and purposes... (1)

hrrY (954980) | more than 7 years ago | (#16054222)

This is a great concept, but all the same, it imposes more on the game devs and hardware vendors. In my mind this is just another API whose libraries you would need to study, which would consequently drive up the development time for a project; in this day and age that will slowly kill off hype before release, and then kill off SALES when the end-user discovers that the shiny new AI processor hinders the bliss of 60+ FPS... If you look at the PhysX cards and all the hype they generated as an example, this kind of third-party implementation actually makes the games perform sub-par. Also, I have not seen any tangible difference in-game between physics performance with PhysX and without PhysX; you will only see graphics performance, which does not equal "physics performance". So until the chipset manufacturers (Intel, Nvidia, Via) get involved with providing solutions to let these products do what they are advertised to do, it is all for naught.

You mean way of the past, right? (2, Insightful)

Yvan256 (722131) | more than 7 years ago | (#16054239)

Is this the 'way of the future' for PC titles? Will games powered by specific pieces of hardware become the norm?
If you look at the Amiga, I think it had a CPU or co-processor for almost everything...

As for this new thing "doing the same thing as the PhysX processor", we'd have to see this PhysX processor in action (and on the market) first, wouldn't we?

So what? (1)

JensR (12975) | more than 7 years ago | (#16054261)

I assume you could accelerate A* with a dedicated chip, but that makes up only a relatively small part of AI. Or you could accelerate neural networks, but most games I know use relatively plain state machines.
I'd move the pathfinding onto another thread, and with the growing popularity of multi-core architectures you should get the same effect. That way you'd share most of the resources with the rest of the system, and wouldn't have to worry about sending everything over the bus to another card.
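
For reference, the "plain state machine" in question is usually nothing more elaborate than the sketch below; the states, thresholds and sense data are generic examples rather than anything from a specific title.

// A plain NPC state machine of the kind most games use; states and
// transition conditions are generic examples, not from any specific game.
#include <cstdio>

enum class NpcState { Idle, Patrol, Chase, Attack, Flee };

struct NpcSenses { bool seesPlayer; float distanceToPlayer; float health; };

NpcState nextState(NpcState current, const NpcSenses& s)
{
    if (s.health < 0.2f)                            return NpcState::Flee;
    if (s.seesPlayer && s.distanceToPlayer < 2.0f)  return NpcState::Attack;
    if (s.seesPlayer)                               return NpcState::Chase;
    if (current == NpcState::Idle)                  return NpcState::Patrol;
    return current;
}

int main()
{
    NpcState state = NpcState::Idle;
    NpcSenses senses{true, 10.0f, 1.0f};
    state = nextState(state, senses);       // Idle -> Chase once the player is spotted
    std::printf("state=%d\n", static_cast<int>(state));
    return 0;
}

There is nothing here for a coprocessor to accelerate; the expensive parts are the sensing and pathfinding queries that feed it.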

I know this is harsh and all but... (5, Insightful)

ThomasBHardy (827616) | more than 7 years ago | (#16054433)

Ok I do know I should be more tolerant of my fellow man and all that stuff, but really... this is just damned foolish.

Imagine the conversation that led to this...

-misty flashback fade-

Marketing Guy : Oh man, gaming is ready for a revolution!
Technical Guy : It's called a Wii now

Marketing Guy : Huh? We now what? -shakes head- I mean these gamers, they buy top end stuff, they have money to burn!

Technical Guy : Not really, they buy slightly under the curve and tweak up and overclock mostly
Marketing Guy : No no I read in a magazine that all gamers have more common sense than money
Technical Guy : -sigh-

Marketing Guy : These Ageia guys really whipped up a lot of frenzy about a new type of add-on card.
Technical Guy : Yeah it's supposed to make the games run better by adding physics processing but the demo..
Marketing Guy : And they are making money hand over fist!
Technical Guy : Well, actually...

Marketing Guy : And it's so easy to make specialty stuff!!
Technical Guy : But their demo runs the same even without the card!

Marketing Guy : Wait, Wait, I got it! We'll make a card that adds more CPU power!
Technical Guy : Well dual cores add lots of CPU power that has yet to be tapped by games
Marketing Guy : No wait, even better, we'll make it special! That's what made the Ageia guys rich!

Technical Guy : Listen, the Ageia guys are not selling much, you might not want to...
Marketing Guy : We'll add better AI! That's IT!

Technical Guy : Better AI?
Marketing Guy : Yeah, we'll sell a card that makes the games run better!
Technical Guy : How's that work?
Marketing Guy : We'll umm, make it able to process AI commands like a graphics card processes graphics commands.

Technical Guy : But Graphics Commands are standardized, so they can optimize for that.
Marketing Guy : We'll get them to standardize AI commands.

Technical Guy : -twitches- But, every game has different needs from AI
Marketing Guy : So we'll make it flexible, generic, so it can do anything

Technical Guy : If it's a generic processor design, it's the same as a regular CPU.
Marketing Guy : Exactly!

Technical Guy : But then what is its advantage?
Marketing Guy : Haven't you been listening? It'll make games play BETTER!

My bet: it won't fly (4, Insightful)

archeopterix (594938) | more than 7 years ago | (#16054464)

I think it won't repeat the success of 3d acceleration, because AI is quite unlike 3D. The key factor in 3d accelerators' success is IMHO a very good set of primitives. If you are fast at drawing large numbers of textured triangles potentially obscuring each other, then you are there (almost) - you can accelerate practically any 3d game. I don't see anything like this in AI. Well, perhaps a generic depth-first search accelerator for brute force algorithms, but the problem I see with that is that the search spaces will vary from game to game, so you probably won't be faster than your current multi-core generic CPU.

It seems that those guys did what's best under these circumstances - got a specific search space that is common in many games and specialized in that. IMHO, it's not enough to get the snowball rolling, but time will tell.

Not for home use (0)

Dachannien (617929) | more than 7 years ago | (#16054465)

The obvious application for this technology is in MMOG servers, not in desktop machines.

No, the cat does not "got my tongue." (2, Informative)

Impy the Impiuos Imp (442658) | more than 7 years ago | (#16054588)

> In fact, AIseek guarantees that with its coprocessor NPCs
> will always be able to find the optimal path in any title using the processor.

It has been mathematically demonstrated there is no general pathfinding solution significantly better than trying all possibilities (though pretty much only in degenerate cases could the best path be difficult not to find by a hill-climbing heuristic.)

Still, it should be trivial to whip up a case that would require these dedicated processors longer than the known age of the universe to find the optimal path.

Re:No, the cat does not "got my tongue." (1)

Hast (24833) | more than 7 years ago | (#16055152)

What? A* finds the optimal solution as long as the heuristic is admissible (it never overestimates the remaining cost). There are better algorithms of course, but to claim that you might as well brute force is just silly.

Or perhaps I just completely misunderstood your statement?

amd's HyperTransport (1)

Joe The Dragon (967727) | more than 7 years ago | (#16054650)

This will be a big win for HyperTransport-based coprocessors / cards if AMD can make it really easy for chips like this to have a HyperTransport bus, and it would force Intel to use HyperTransport.

By the time Intel gets its own HyperTransport-like bus there may be a lot of HyperTransport cards / coprocessors out there, and it would be a very bad move for Intel to try to push its own CSI bus.

Custom hardware? already got that. (1)

KE1LR (206175) | more than 7 years ago | (#16054984)

Is this the 'way of the future' for PC titles? Will games powered by specific pieces of hardware become the norm?

Hmm, I thought we had that already... What about all those game consoles with custom video chips and CPUs in them? (PS1, PS2, Xbox, Xbox 360, Gamecube...)

IMO, this chipset (or at least its functionality) may be more likely to find a home in consoles than as an add-on for PC systems.

As others have pointed out, the number of people who would drop an extra $100 to get the last erg of performance out of their gaming system is pretty small.

I've seen it in action (0)

Anonymous Coward | more than 7 years ago | (#16055042)

It's real; they already have a chip manufactured, which they demonstrated at recent shows.

They have a playable game that looks just like one of the videos on the site (the one with the soldiers, although the game graphics looked a bit different when I played it), in which you control the tank and the soldiers act intelligently.

Destroying the city is extremely fun, and it was the first time ever that a large group of AI-controlled agents looked intelligent to me.

Hope they make it to the market soon. If game studios use this coprocessor properly, games are going to become much more realistic, and much less micromanaged.

Next up, (0)

Anonymous Coward | more than 7 years ago | (#16055072)

... a chip that plays the games for you, so you don't have to!

Games never needed sophisticated AI (1)

Astarica (986098) | more than 7 years ago | (#16055109)

You never needed serious firepower for a computer opponent that pretends to be actually doing something before ultimately rolling over. To think that humans can even compete against a computer in games that test physical accuracy (a computer is infinitely more accurate than any human) or multitasking is simply foolish.