Basically, it's all here: Bullet Framerate Independence Stuff.
Source code, graphs, and a tentative conclusion (mostly supporting a comment from Erwin) here:
Anyway, I just thought it might be of interest, plus I was hoping to ask a question or two. For context, Erwin's advice was:

You can decrease the inner timestep to improve accuracy in collision handling. If you pass the real timestep, then make sure maxNumSubSteps is large enough that stepTime < maxNumSubSteps * internalTimeStep; otherwise the simulation will drop frames. So you can increase maxNumSubSteps to a very large value. Alternatively, just pass '0' for maxNumSubSteps along with a very small stepTime.
Why does maxNumSubSteps=0 appear to work perfectly almost every time, except when internalTimeStep < stepTime?
Why do all the graphs suddenly appear to break just below an internalTimeStep of 1/200? I suspect this one is the fault of my own crappy code somewhere, but I'm not sure.