GPGPU AI

rouncer 104 Feb 23, 2011 at 18:38

I just had this crazy idea; do you think it would work?

What if you sequentially stored bytecode in a texture and “interpreted” it on the GPU…

Pass a texture for the variable data (which the CPU modifies) and pass a texture for the actual script.

I’m thinking you would mostly use this for AI/actor code.

How big a texture would you need, and would this be feasible?
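For concreteness, here is one possible sketch of that idea, written as a CUDA kernel rather than as a shader: plain buffers stand in for the “script texture” and the “variable texture”, one thread runs the interpreter for one agent, and the four-word instruction format and opcode names are made up purely for illustration.

```cuda
// Hypothetical four-word instruction format: [opcode, srcA, srcB, dst], with the
// script assumed to end in OP_HALT. In the shader version described above, both
// arrays would be textures and the reads would be texture fetches.
enum Op { OP_ADD = 0, OP_SUB = 1, OP_SET = 2, OP_HALT = 3 };

__global__ void runAgents(const int* script,      // shared bytecode ("script texture")
                          float*     state,       // per-agent variables ("variable texture")
                          int numAgents, int regsPerAgent, int maxSteps)
{
    int agent = blockIdx.x * blockDim.x + threadIdx.x;
    if (agent >= numAgents) return;

    float* r  = state + agent * regsPerAgent;   // this agent's row of variables
    int    ip = 0;                              // instruction pointer into the script

    for (int step = 0; step < maxSteps; ++step) {
        int op = script[ip], a = script[ip + 1], b = script[ip + 2], dst = script[ip + 3];
        ip += 4;
        if      (op == OP_ADD) r[dst] = r[a] + r[b];
        else if (op == OP_SUB) r[dst] = r[a] - r[b];
        else if (op == OP_SET) r[dst] = (float)a;   // load an immediate value
        else break;                                 // OP_HALT (or anything unknown)
    }
}
```

On the graphics pipeline described in the thread, the same loop would live in a shader, with texture fetches in place of the array reads.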

5 Replies


Reedbeta 167 Feb 23, 2011 at 18:44

Maybe interesting if you have a very large number of agents (like 1000s) that are executing relatively simple behaviors and have relatively little state.

You’d also have to consider how to get the data out of the texture to make the agents do their things, e.g. playing animations and moving around. That stuff is usually done on the CPU, but you don’t want to do readbacks from video memory.

People have proposed this idea before for particle systems. The idea is that each texel represents one particle, you have a set of textures storing position, velocity, etc., and you use vertex-program texture fetches to render the particles in the world.
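As a hedged illustration of that texel-per-particle layout, here is roughly what the per-particle update looks like when written as a CUDA kernel instead of a render-to-texture pass (the arrays stand in for the position/velocity textures; the names and the toy integration are hypothetical):

```cuda
#include <cuda_runtime.h>

// One thread per particle; the pos/vel arrays stand in for the per-texel
// position/velocity textures.
__global__ void stepParticles(float4* pos, float4* vel, int count, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= count) return;

    float4 v = vel[i];
    v.y -= 9.81f * dt;                     // simple gravity
    float4 p = pos[i];
    p.x += v.x * dt;
    p.y += v.y * dt;
    p.z += v.z * dt;
    pos[i] = p;
    vel[i] = v;
}
```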

marcgfx 101 Feb 23, 2011 at 18:46

More importantly, how do you handle the information in the texture? You would also have to interpret the script. I’m not saying it’s not possible, but I couldn’t do it.

Kenneth_Gorking 101 Feb 23, 2011 at 19:20

I don’t know if scripting on the GPU is feasible, because it usually involves a lot of callbacks into the engine, but GPU AI is possible.

rouncer 104 Feb 23, 2011 at 20:35

Hmm, I didn’t think about readbacks…
I’ve thought a little more, and I think you could implement it by rendering a single point per entity; you could certainly handle more than one entity at once, though.
Maybe if you used the geometry shader and output a pixel-sized point (I know this is getting a little nuts, hehe) for each “actor arm”, you could set a number of arms to render into a small texture to read back on the CPU. Say you used a 1x8 texture: you tell it to render “movement distance” and “animation frame”, then whatever else you needed, on different pixels in this “GPU output” texture… then read it back.

You’d need…

CPU TO GPU
1-by-x variable texture (possibly 32-bit floats)

OPERATE ON GPU (pre-stored in video RAM)
256x256 script texture; you’d have to organize the bytecode however you like.

GPU TO CPU (maybe a floating-point or byte texture would do, because reading back is really slow, isn’t it…)
1-by-x output texture, which the geometry shader writes to at explicit points on different pixels.

Well, I’m crazy enough to think it might work, but I don’t really know how extensive a scripting language could be carried out in the geometry shader… Maybe you’re right, Reedbeta, and it suits simple AI with lots of entities, but then again maybe you wouldn’t get that many entities because there would be too many readbacks. I don’t know.

The cool thing is, it would make plugging in AI really easy, just add a shader… that’s what really sparked me up about the idea.
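A rough host-side sketch of that CPU-to-GPU-to-CPU round trip, again in CUDA rather than with textures and a geometry shader (the names, the AgentOutput record, and the kernel it launches are all assumptions carried over from the earlier sketch):

```cuda
#include <cuda_runtime.h>

// Forward declaration of the interpreter kernel from the earlier sketch.
__global__ void runAgents(const int* script, float* state,
                          int numAgents, int regsPerAgent, int maxSteps);

// Hypothetical per-agent output record: the "movement distance" and
// "animation frame" pixels from the small GPU-to-CPU output texture.
struct AgentOutput { float moveDistance; int animFrame; };

void tickAgents(const float* hostVars, AgentOutput* hostOut,
                float* devVars, const int* devScript, AgentOutput* devOut,
                int numAgents, int regsPerAgent)
{
    size_t varBytes = (size_t)numAgents * regsPerAgent * sizeof(float);

    // CPU -> GPU: the "1-by-x variable texture".
    cudaMemcpy(devVars, hostVars, varBytes, cudaMemcpyHostToDevice);

    // Operate on GPU: the script buffer plays the role of the 256x256 script
    // texture. (Filling devOut is assumed to happen on the GPU side; omitted.)
    int threads = 256, blocks = (numAgents + threads - 1) / threads;
    runAgents<<<blocks, threads>>>(devScript, devVars, numAgents, regsPerAgent, 1024);

    // GPU -> CPU: the small "output texture". This copy waits for the kernel to
    // finish, and it is exactly the readback cost discussed above.
    cudaMemcpy(hostOut, devOut, (size_t)numAgents * sizeof(AgentOutput),
               cudaMemcpyDeviceToHost);
}
```

The final copy back is the readback being worried about above; keeping the output record tiny (a few values per agent) is what keeps it tolerable.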

rouncer 104 Feb 23, 2011 at 20:49

marcgfx, you’d have to implement a calculator in the shader… sound impossible? :)

Just imagine reading in two variables and an operator; you could make it all a step-by-step procedure, one operation at a time.

You could even have a “goto” command: you just change the texture coordinate that stands for the instruction pointer.
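A minimal sketch of that step-by-step dispatch, using the same made-up instruction format as the earlier kernel plus a jump opcode; the “goto” really is nothing more than overwriting the instruction pointer instead of advancing it:

```cuda
// Same hypothetical instruction format as before, plus a jump. SOP_JMP overwrites
// the instruction pointer instead of advancing it, which is all a "goto" has to be.
enum ScriptOp { SOP_ADD = 0, SOP_SET = 1, SOP_JMP = 2, SOP_HALT = 3 };

__device__ void interpret(const int* script, float* r, int maxSteps)
{
    int ip = 0;                                    // the "texture coordinate" / instruction pointer
    for (int step = 0; step < maxSteps; ++step) {  // capped so a bad jump can't loop forever
        int op = script[ip], a = script[ip + 1], b = script[ip + 2], dst = script[ip + 3];
        if      (op == SOP_ADD) { r[dst] = r[a] + r[b]; ip += 4; }
        else if (op == SOP_SET) { r[dst] = (float)a;    ip += 4; }
        else if (op == SOP_JMP) { ip = a; }            // the "goto": retarget the pointer
        else break;                                    // SOP_HALT
    }
}
```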