Hi,
I have a problem: I've built a simulation, but it's too computationally heavy.
Is there a way to use more computing power? Something like running it on the GPU?
Thanks
Simulation too slow (use of GPU)
Re: Simulation too slow (use of GPU)
Bullet is typically single-threaded, but there are multithreaded demos in the source tree (though they lack good documentation). If you haven't switched to the multithreaded code yet, it's worth trying; a minimal setup looks like the sketch below.
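Here's a rough sketch of what the multithreaded world setup can look like. Note the hedge: the btDiscreteDynamicsWorldMt / btCollisionDispatcherMt classes and this constructor signature come from later Bullet 2.x releases (roughly 2.88+, built with BT_THREADSAFE), not necessarily from the demos shipped with the version you have, so treat the names as assumptions and check them against your source tree.

```cpp
// Minimal multithreaded Bullet world, assuming a Bullet 2.x build with
// BT_THREADSAFE enabled. Mirrors the structure of the MultiThreadedDemo
// in the examples; names may differ in older releases.
#include "btBulletDynamicsCommon.h"
#include "BulletCollision/CollisionDispatch/btCollisionDispatcherMt.h"
#include "BulletDynamics/Dynamics/btDiscreteDynamicsWorldMt.h"
#include "BulletDynamics/ConstraintSolver/btSequentialImpulseConstraintSolverMt.h"
#include "BulletDynamics/ConstraintSolver/btConstraintSolverPoolMt.h"
#include "LinearMath/btThreads.h"

btDiscreteDynamicsWorld* createMtWorld()
{
    // Pick a task scheduler before constructing any Mt objects.
    btITaskScheduler* scheduler = btCreateDefaultTaskScheduler();
    btSetTaskScheduler(scheduler);

    btDefaultCollisionConfiguration* config = new btDefaultCollisionConfiguration();
    // Mt dispatcher performs narrowphase collision in parallel batches.
    btCollisionDispatcherMt* dispatcher = new btCollisionDispatcherMt(config);
    btBroadphaseInterface* broadphase = new btDbvtBroadphase();
    // One constraint solver per worker thread, plus a single Mt solver
    // used for islands too big to batch.
    btConstraintSolverPoolMt* solverPool =
        new btConstraintSolverPoolMt(scheduler->getNumThreads());
    btSequentialImpulseConstraintSolverMt* solverMt =
        new btSequentialImpulseConstraintSolverMt();

    btDiscreteDynamicsWorldMt* world = new btDiscreteDynamicsWorldMt(
        dispatcher, broadphase, solverPool, solverMt, config);
    world->setGravity(btVector3(0, -10, 0));
    return world;
}
```

Stepping the world is unchanged (world->stepSimulation(dt) as usual); the parallelism happens inside collision dispatch and island solving, so the rest of your code shouldn't need to change.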
Is your simulation so demanding that you need more power than the multithreaded code can provide? If so, you could wait for Bullet 3's GPU pipeline, but you may be waiting a while: it's been two months since the last public update on its progress. The code is up at https://github.com/erwincoumans/bullet3 if you're feeling adventurous, but I haven't tried it myself since the documentation is practically non-existent.