I am simulating an object falling under gravity from a height.
The object falls from 20 units up, with gravity set to 10. I haven't set any units explicitly, so I'm on the defaults, which I understand to be meters and seconds.
By t = sqrt(2h/g) = sqrt(2 * 20 / 10), I believe it ought to take my object 2 seconds to fall to the ground from 20 meters up. However, it seems to take far longer (over 10 seconds), so I'm certain I'm doing something wrong!
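For reference, here's the sanity check I did of that 2-second figure: a plain fixed-step integration of the fall, completely independent of the engine (I'm using a 1/60 s step here only because I understand that to be Bullet's default internal timestep):

```python
# Sanity check: a body falling from y = 20 m under g = 10 m/s^2,
# integrated with a fixed 1/60 s step (semi-implicit Euler).
g = 10.0
dt = 1.0 / 60.0
y, v, t = 20.0, 0.0, 0.0
while y > 0.0:
    v -= g * dt   # update velocity first...
    y += v * dt   # ...then position (semi-implicit Euler)
    t += dt
print(round(t, 2))  # -> 2.0, matching sqrt(2h/g)
```

So the analytic answer and a naive fixed-step integration agree on 2 seconds.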

I call StepSimulation(0.1), leaving maxSubSteps and fixedTimeStep at their defaults. From the docs, that seems like it should be OK, but maybe I'm misreading how the substepping works.
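In case it helps, here's my mental model of what StepSimulation does with those defaults, based on my reading of the docs (this is NOT Bullet's actual code, just a simplified sketch; I believe the defaults are maxSubSteps = 1 and fixedTimeStep = 1/60 s, but please correct me if that's wrong):

```python
# Simplified model of stepSimulation's substep clamping, as I read the
# docs -- not the real implementation, just my understanding of it.
def step_simulation(state, time_step, max_sub_steps=1, fixed_time_step=1.0/60.0):
    # How many whole fixed substeps fit in time_step, clamped to the maximum.
    num_steps = min(int(time_step / fixed_time_step), max_sub_steps)
    for _ in range(num_steps):
        state["t"] += fixed_time_step  # stand-in for one internal physics tick
    return num_steps

# If this model is right, 20 calls with 0.1 s (i.e. 2 s of my time) only
# advance the physics clock by 20 * (1/60) s:
state = {"t": 0.0}
for _ in range(20):
    step_simulation(state, 0.1)
print(round(state["t"], 2))  # -> 0.33
```

Is this roughly what happens internally, or am I misunderstanding the role of maxSubSteps?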
If anyone can shed any light on what I may be doing wrong, I would be most grateful.
Best Regards
Matt Taylor