Missing vital point with stepSimulation

sipickles
Posts: 44
Joined: Thu Aug 07, 2008 6:57 am

Missing vital point with stepSimulation

Post by sipickles »

Hi,

I'd like to discuss btDiscreteDynamicsWorld::stepSimulation.

I thought I had this resolved long ago, but timing issues have reared their ugly head once again. In particular, my server and client simulations are running at very different rates.

I noticed my frame rate on the client was down, so I turned on my profiler. This showed that all the lost time was being spent calling:

Code:

m_dynamicsWorld->stepSimulation( timeDeltaMs * 0.001f, 10 ); // expects timeDelta in seconds

I was calling this function 30 times a second and supplying the time delta since the last call. The profiler said I spent 88ms in this call! If I changed the call to allow only 1 maxSubStep, my FPS returned (the profiler says 10ms), but the actual simulation movement was slow.

I thought maybe my timer was bad (even though it worked before), so I switched to btClock, specifically timing the delta between calls to stepSimulation, and upped the update rate to 60Hz. It's better, but movement is still not right on the client.
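
For reference, here is roughly what my update loop looks like now (simplified; updatePhysics is just my own wrapper, not anything from Bullet):

Code:

#include "btBulletDynamicsCommon.h"
#include "LinearMath/btQuickprof.h" // btClock

static btClock s_clock; // measures wall-clock time between calls

void updatePhysics(btDiscreteDynamicsWorld* world)
{
    // Delta time since the previous call, in seconds.
    btScalar dtSeconds = btScalar(s_clock.getTimeMicroseconds()) * btScalar(1e-6);
    s_clock.reset();

    // Let Bullet catch up with at most 10 internal sub steps
    // of the default 1/60 s fixed time step.
    world->stepSimulation(dtSeconds, 10);
}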

The client is a Win7 build running DirectX, with a parallel sim taking place on an Ubuntu Linux server. I can turn on rendering of the incoming server updates, and it is clear the client is dragging behind the correct speed shown on the server (server profiler: 1ms for the same call to stepSimulation!). The client is constantly snapping to the position of the server because the sims are so different. It's not a network issue. This used to work!

How should I be using stepSimulation correctly? In particular, I am puzzled why the maxSubSteps parameter affects speed when the loop is already fast.

Thanks for any advice.

Simon
Erwin Coumans
Site Admin
Posts: 4221
Joined: Sun Jun 26, 2005 6:43 pm
Location: California, USA

Re: Missing vital point with stepSimulation

Post by Erwin Coumans »

In particular, I am puzzled why the maxSubSteps parameter affects speed when the loop is already fast.
Apparently the loop is not fast enough: if stepSimulation takes longer than real-time, there is a problem that you have to solve.
This effect is also known as the well of despair, and that's why there is a maxSubSteps parameter.
If you set maxSubSteps to 1 and stepSimulation (one internal sub step) takes longer than real-time (the delta time you pass in as the first argument), you will lose time and notice a slowdown.
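
Internally, stepSimulation does something along these lines (paraphrased, not the exact source):

Code:

// Paraphrased: how stepSimulation decides the number of sub steps.
m_localTime += timeStep;                             // accumulate the wall-clock time you pass in
int numSubSteps = int(m_localTime / fixedTimeStep);  // how many fixed steps fit
m_localTime -= numSubSteps * fixedTimeStep;          // keep only the remainder
int clampedSubSteps = (numSubSteps > maxSubSteps) ? maxSubSteps : numSubSteps;
// Only clampedSubSteps * fixedTimeStep of simulation time is advanced,
// so with maxSubSteps = 1, any frame longer than fixedTimeStep silently
// drops the excess: the simulation falls behind real time.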

Does this happen in optimized/release builds or debug/unoptimized builds?

You can check where exactly the time is spent within stepSimulation, and try optimizing the dynamics world taking that into account.
Right after stepSimulation, call CProfileManager::dumpAll() to see detailed statistics.
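
For example (assuming profiling has not been compiled out with BT_NO_PROFILE):

Code:

#include "LinearMath/btQuickprof.h" // CProfileManager

m_dynamicsWorld->stepSimulation( timeDeltaMs * 0.001f, 10 );
CProfileManager::dumpAll(); // prints the hierarchical timing tree to stdout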

Thanks,
Erwin
sipickles
Posts: 44
Joined: Thu Aug 07, 2008 6:57 am

Re: Missing vital point with stepSimulation

Post by sipickles »

OK, that's the problem. The client on Win32 is compiling as Debug; the server is compiling in Release.

That has a huge effect during collision detection, right?

When I compile client in release, it keeps up with server.

How many collision objects should I consider a maximum in a single world?

What is the best way to compartmentalise a larger space? A quadtree with ghost objects providing collisions with objects in neighbouring cells?