PAL : Physics engine comparison : Source code and results

aboeing
Posts: 33
Joined: Tue Jul 26, 2005 2:28 pm

PAL : Physics engine comparison : Source code and results

Post by aboeing »

Hi All,

I have uploaded most of the source code for the physics engine comparison program that won the Eurographics prize. (The code is not very clean; it was a bit of a rush job. It is in CVS at http://sourceforge.net/projects/pal)

You can download the video and an executable here:
http://www.adrianboeing.com/pal/benchmark.html#irrdemo

The benchmark set is briefly described here:
http://www.adrianboeing.com/pal/benchma ... #benchmark

And some of the result data can be graphed here:
http://www.adrianboeing.com/pal/pal_bench_graph.html

I would definitely welcome any comments or suggestions, and of course critiques are welcome too.
Antonio Martini
Posts: 126
Joined: Wed Jul 27, 2005 10:28 am
Location: SCEE London

Re: PAL : Physics engine comparison : Source code and results

Post by Antonio Martini »

aboeing wrote: And some of the result data can be graphed here:
http://www.adrianboeing.com/pal/pal_bench_graph.html
It doesn't seem to work; I just see an empty "graph". Am I missing something?

cheers,
Antonio
topcomer
Posts: 31
Joined: Thu Sep 21, 2006 1:53 pm
Location: sweden but italian

Re: PAL : Physics engine comparison : Source code and results

Post by topcomer »

AntonioMartini wrote:
aboeing wrote: And some of the result data can be graphed here:
http://www.adrianboeing.com/pal/pal_bench_graph.html
It doesn't seem to work; I just see an empty "graph". Am I missing something?

cheers,
Antonio
In my case, I had missed that you are supposed to check the engines you do NOT want to include.
aboeing
Posts: 33
Joined: Tue Jul 26, 2005 2:28 pm

Re: PAL : Physics engine comparison : Source code and results

Post by aboeing »

AntonioMartini wrote: It doesn't seem to work; I just see an empty "graph". Am I missing something?
Hi Antonio,

Can you tell me which graph you were trying to create and which browser you are using?

Thanks.
Antonio Martini
Posts: 126
Joined: Wed Jul 27, 2005 10:28 am
Location: SCEE London

Re: PAL : Physics engine comparison : Source code and results

Post by Antonio Martini »

aboeing wrote:
AntonioMartini wrote: It doesn't seem to work; I just see an empty "graph". Am I missing something?
Hi Antonio,

Can you tell me which graph you were trying to create and which browser you are using?

Thanks.
I'm using IE7 and I tried different graphs; none of them work. I will try with my computer at home.

cheers,
Antonio
Erwin Coumans
Site Admin
Posts: 4221
Joined: Sun Jun 26, 2005 6:43 pm
Location: California, USA

Re: PAL : Physics engine comparison : Source code and results

Post by Erwin Coumans »

Hi Adrian,

Some requests about the benchmark (graph):

- Can you add the version and/or download date for each physics engine?
The benchmark is just a snapshot in time, and engines will likely improve, so it is good to see which version is used.

- Document the non-default settings used in the benchmark on the webpage. For example, did you use quickstep or worldstep in ODE?

- Download information on how you obtained each engine.

- Adding convex mesh/hull collision tests. Perhaps other collision shapes, like heightfield, cylinders, capsules, compounds.

- Ragdoll tests, just using unlimited point-to-point constraints, sliding on a sloped 3D triangle mesh.

- providing COLLADA files for each test.

- TrueAxis cannot be disabled. Perhaps it is better to choose which engines to include, instead of which to exclude (it is a bit confusing).

Thanks for sharing this work,
Erwin
Erin Catto
Posts: 316
Joined: Fri Jul 01, 2005 5:29 am
Location: Irvine

Re: PAL : Physics engine comparison : Source code and results

Post by Erin Catto »

This is great! Too bad Havok is not represented (not your fault).

I remember Game Developer magazine did a physics engine comparison a long time ago. I think the community has really needed something like this to make the quality and performance differences more obvious and objective. It also gives a nice target and sanity check for in-house developers.

I found many of your results surprising, based on my prior assumptions about the quality of various engines.

It does seem that you are using ODE's slow solver. You should probably use the quickstep solver to put it on an equal footing.
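
For reference, switching is a one-line change in ODE; a minimal sketch, assuming the world has already been created and populated elsewhere:

    #include <ode/ode.h>

    // Advance the simulation one step with either of ODE's two solvers.
    void step_world(dWorldID world, dReal dt, bool use_quickstep)
    {
        if (use_quickstep) {
            // Iterative SOR LCP solver: much faster, slightly less accurate.
            dWorldSetQuickStepNumIterations(world, 20);
            dWorldQuickStep(world, dt);
        } else {
            // The original "big matrix" solver: slower but more accurate.
            dWorldStep(world, dt);
        }
    }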

I don't think these tests need to be fancy. Very simple scenarios, like box stacking and bridges, can tell you a lot about a physics engine.

Finally, I like that this is open source. This will let the respective physics engine teams help you tweak your settings to make their engine run as well as possible.
ngaloppo
Posts: 11
Joined: Wed Dec 06, 2006 1:59 am
Location: Chapel Hill, NC

Re: PAL : Physics engine comparison : Source code and results

Post by ngaloppo »

Hi!

I agree with Erin, this is very useful stuff, especially for sanity-checking! Thanks for all the efforts! I have a few comments:

* Collision benchmark: it seems that only 3 engines are represented in the graphs?
* For the Materials benchmark: Is it possible to plot the 'physically correct' analytical path as well?
* Suggestion for another graph: stacking benchmark, but plot the error (e.g. in terms of total penetration etc...)
* Maybe use some plot ticks as well, so the different lines are better distinguishable?

Thanks again!

--nico
Antonio Martini
Posts: 126
Joined: Wed Jul 27, 2005 10:28 am
Location: SCEE London

Re: PAL : Physics engine comparison : Source code and results

Post by Antonio Martini »

ngaloppo wrote: Hi!

I agree with Erin, this is very useful stuff, especially for sanity-checking! Thanks for all the efforts! I have a few comments:

* Collision benchmark: it seems that only 3 engines are represented in the graphs?
* For the Materials benchmark: Is it possible to plot the 'physically correct' analytical path as well?
* Suggestion for another graph: stacking benchmark, but plot the error (e.g. in terms of total penetration etc...)
* Maybe use some plot ticks as well, so the different lines are better distinguishable?

Thanks again!

--nico
I believe that nowadays most of the engines would give reasonable stacking with the realistic number of boxes you would find in a game; however, not many support stiff motors/constraints and continuous collision detection. So an engine that stacks 10 times faster than any other engine, in a situation that will never happen in our application, while missing some _crucial_ features required for building a modern game, would still be the inferior one. A physics engine nowadays is expected to be reasonably fast and feature complete. I mean robust features, not just a tick next to the name. So "feature completeness" should be somehow considered.

cheers,
Antonio
aboeing
Posts: 33
Joined: Tue Jul 26, 2005 2:28 pm

Re: PAL : Physics engine comparison : Source code and results

Post by aboeing »

Hi everyone, thank you for your replies - a lot of good stuff! I've been a bit sick so I haven't been here recently.

I have added a basic COLLADA loader to PAL, based on John Ratcliff's excellent lightweight loader for PhysX:
http://pal.sourceforge.net/
I'm using IE7
Unfortunately I don't have IE7, and don't plan on installing it, so I'll put up a notice about that. Thanks for the report.
- Can you add the version and/or download date for each physics engine?
I'm working on adding this so it will automatically include it when the benchmark is run. This should make it easier to keep things up to date.
- Document the non-default settings used in the benchmark on the webpage. For example, did you use quickstep or worldstep in ODE?
I think this would be too difficult (I used worldstep), but I will add a note about this.
- Download information on how you obtained each engine.
This is available on the engines list page.
- Adding convex mesh/hull collision tests. Perhaps other collision shapes, like heightfield, cylinders, capsules, compounds.
Great suggestion, I'll add this.
The metric I would report would be computational efficiency; do you have any additional ideas?
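Concretely, what I record per engine is just the wall-clock time of the update loop, roughly like this (a sketch; the step callback stands in for whichever engine is being driven):

    #include <chrono>
    #include <functional>

    // Average wall-clock cost of one physics update, in milliseconds.
    // 'step' stands in for the engine-specific update call.
    double average_step_ms(const std::function<void(float)>& step,
                           float dt, int num_steps)
    {
        const auto begin = std::chrono::steady_clock::now();
        for (int i = 0; i < num_steps; ++i)
            step(dt);
        const auto end = std::chrono::steady_clock::now();
        const std::chrono::duration<double, std::milli> elapsed = end - begin;
        return elapsed.count() / num_steps;
    }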
- Ragdoll tests, just using unlimited point-to-point constraints, sliding on a sloped 3D triangle mesh.
A nice idea, but I'm not sure what metric to report here. Do you think finding the distance between links, à la the bridge link test, would be sensible? Would this test reveal any more information about the constraint system than the bridge test does?
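To make that concrete, the measurement I have in mind is simply the drift between the two anchor points of each joint (a sketch, assuming the engine can report the joint anchor as seen from each body):

    #include <cmath>

    struct Vec3 { float x, y, z; };

    // Constraint drift: distance between the joint anchor computed from
    // body A and the same anchor computed from body B. It is zero for a
    // perfectly satisfied point-to-point constraint.
    float joint_drift(const Vec3& anchor_on_a, const Vec3& anchor_on_b)
    {
        const float dx = anchor_on_a.x - anchor_on_b.x;
        const float dy = anchor_on_a.y - anchor_on_b.y;
        const float dz = anchor_on_a.z - anchor_on_b.z;
        return std::sqrt(dx * dx + dy * dy + dz * dz);
    }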
- providing COLLADA files for each test.
Coming soon! I've included the PAL COLLADA loader now.
- TrueAxis cannot be disabled. Perhaps it is better to choose which engines to include, instead of which to exclude (it is a bit confusing).
That's me being lazy with the Python script :) I'll see if I can't change that then :)
* Collision benchmark: it seems that only 3 engines are represented in the graphs?
That's because the other engines do not pass this test! (You will see what happens if you compile and run the binary version of the test.)
* For the Materials benchmark: Is it possible to plot the 'physically correct' analytical path as well?
Not easily (AFAIK, feel free to correct me!), but the ideal bounce height can be calculated; there will be more info on this soon.
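For reference, the ideal value is easy to derive: each impact scales the rebound speed by the coefficient of restitution e, so the height scales by e squared. A minimal sketch:

    #include <cmath>

    // Ideal rebound height after the n-th bounce, ignoring air resistance:
    // each impact scales the speed by e, so the height scales by e * e.
    double ideal_bounce_height(double h0, double e, int n)
    {
        return h0 * std::pow(e * e, n);
    }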
* Suggestion for another graph: stacking benchmark, but plot the error (e.g. in terms of total penetration etc...)
This would require a bit of collision detection code; does anyone have a simple convex-object/point test? (Otherwise I'll write my own; it's not that hard.) But I will see if I can add this.
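Roughly what I have in mind (a sketch, assuming each convex shape can be queried as a set of outward-facing planes n·x = d):

    #include <algorithm>
    #include <limits>
    #include <vector>

    struct Vec3  { float x, y, z; };
    struct Plane { Vec3 n; float d; };   // outward unit normal n, plane n . x = d

    static float dot(const Vec3& a, const Vec3& b)
    {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }

    // Penetration depth of point p into the convex volume bounded by 'planes'.
    // A negative return value means p lies outside at least one face plane.
    float point_convex_penetration(const Vec3& p, const std::vector<Plane>& planes)
    {
        float depth = std::numeric_limits<float>::max();
        for (const Plane& pl : planes) {
            const float signed_dist = dot(pl.n, p) - pl.d;  // > 0: outside this face
            depth = std::min(depth, -signed_dist);
        }
        return depth;
    }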
* Maybe use some plot ticks as well, so the different lines are better distinguishable?
The tests produce CSV files, so you can replot them in another package if you like. The online version is just to give you a general idea of it all.
not many support stiff motors/constraints
I hope to add a motor benchmark test soon too.
So "feature completeness" should be somehow considered.
A table of features will also be added soon.
Erwin Coumans
Site Admin
Posts: 4221
Joined: Sun Jun 26, 2005 6:43 pm
Location: California, USA

Re: PAL : Physics engine comparison : Source code and results

Post by Erwin Coumans »

aboeing wrote:
- Adding convex mesh/hull collision tests. Perhaps other collision shapes, like heightfield, cylinders, capsules, compounds.
Great suggestion, I'll add this.
The metric I would report would be computational efficiency; do you have any additional ideas?
In general you should first measure performance for simple cases. Most engines have a special optimized case for box-box, so it is fairer to test convex polyhedra as well.
- Ragdoll tests, just using unlimited point-to-point constraints, sliding on a sloped 3D triangle mesh.
A nice idea, but I'm not sure what metric to report here. Do you think finding the distance between links, à la the bridge link test, would be sensible? Would this test reveal any more information about the constraint system than the bridge test does?
No, a box-box bridge doesn't test the same thing. A test of many ragdolls on a 3D triangle mesh is very close to actual game usage, exercising the performance of both the collision detection and the constraint solver for contacts and joints.

In general, I think you should focus on performance tests first. Also, I recommend working more closely with the physics engine providers, through their usual support channels. Otherwise you are just describing how good your personal ability is at getting a physics engine working without support. Support is essential to get good quality and performance out of most physics engines.

It would be great to first see a good performance benchmark for simple, easy-to-reproduce cases. Doing a quality comparison between collision detection and rigid body dynamics engines is very complicated.

Hope this helps,
Erwin
aboeing
Posts: 33
Joined: Tue Jul 26, 2005 2:28 pm

Re: PAL : Physics engine comparison : Source code and results

Post by aboeing »

Hi Erwin,
Thanks for your comments.
In general, I think you should focus on performance tests first.
I am a bit hesitant to do too many computational performance-based tests, since it will be difficult to quantify the performance overhead incurred by the wrapper itself (except for uncontrolled cases, where the additional overhead is just an extra pointer dereference for the update call).
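To illustrate, the per-step cost added in that case is a single virtual call; a simplified sketch (the class names are made up for illustration, not the actual PAL interface):

    // Hypothetical abstraction layer; the names are illustrative, not PAL's.
    class PhysicsWorld {
    public:
        virtual ~PhysicsWorld() {}
        virtual void Update(float timestep) = 0;  // forwards to the wrapped engine
    };

    class OdeWorld : public PhysicsWorld {
    public:
        void Update(float timestep) override {
            // The engine-specific step call would go here,
            // e.g. dWorldQuickStep(world, timestep) for ODE.
            (void)timestep;
        }
    };

    // The benchmark loop pays only one virtual dispatch per update.
    void run(PhysicsWorld& world, float dt, int steps)
    {
        for (int i = 0; i < steps; ++i)
            world.Update(dt);
    }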
Also, I recommend working more closely with the physics engine providers, through their usual support channels.
<begin rant>
I would very much like to; however, you might find that the physics engine providers do not all provide great support. Especially if they know that you are producing a benchmark, they may decide to ignore or hinder you if they feel that your benchmark may come up with unfavourable results.

Would you be as helpful if Bullet did not perform well? Especially if you are entirely financially dependent on the product?

Furthermore, it really doesn't make all that much difference, since it is entirely reasonable to assume that other developers would not be able to reach optimal performance either. (People who just 'use' the software rather than 'create' it are never going to be able to make it work as well as the original creators; there is a knowledge gap.) Do the SPEC benchmarks still have the same validity if you know that chip designers and compiler developers have made specific optimisations just for the test cases in order to get good SPEC scores?
</end rant>

That said, I definitely would not knock back any offer of help from the engine developers or the community at large - that is part of the reason why the tests are open source.

I'll send out another round of emails; perhaps I'll get a better response this time.
Erwin Coumans
Site Admin
Posts: 4221
Joined: Sun Jun 26, 2005 6:43 pm
Location: California, USA

Re: PAL : Physics engine comparison : Source code and results

Post by Erwin Coumans »

aboeing wrote: I would very much like to; however, you might find that the physics engine providers do not all provide great support. Especially if they know that you are producing a benchmark, they may decide to ignore or hinder you if they feel that your benchmark may come up with unfavourable results.
That is a bit surprising; can you be more specific about which engine provider acts like that?
Would you be as helpful if Bullet did not perform well? Especially if you are entirely financially dependent on the product?
Yes, I support your effort, although I have very limited time. Actually Bullet doesn't perform that well. In the various physics benchmarks I have here, Tokamak, ODE (quickstep), PhysX and Havok all outperform Bullet, except on PlayStation 3 and Xbox 360, where we can use the parallel optimized version. So there is a lot of work ahead for Bullet optimizations.
Furthermore, it really doesn't make all that much difference, since it is entirely reasonable to assume that other developers would not be able to reach optimal performance either. (People who just 'use' the software rather than 'create' it are never going to be able to make it work as well as the original creators; there is a knowledge gap.)
Support closes this gap. Please consider the difference between a game developer using middleware with support versus one without support. When I worked for Havok, I realized that support is very important: most game developers use the engine in a very sub-optimal way. So a large part of the price of Havok is due to support.
Do the SPEC benchmarks still have the same validity if you know that chip designers and compiler developers have made specific optimisations just for the test cases in order to get good SPEC scores?
The SPEC analogy is already distorted here, as I mentioned, due to the special box-box case in many engines, so box-box doesn't reflect the average case. Most game levels contain static concave 3D triangle meshes for the environment (not a flat plane), and objects are not only boxes. That is why I asked for measuring several ragdolls sliding over a 3D concave mesh.
I'll send out another round of emails; perhaps I'll get a better response this time.
Email is not the usual Bullet support channel; the Bullet section of this forum is the recommended way, so that others can learn from it.

For the Open Dynamics Engine, the ODE mailing list is the main channel. I would expect that if you ask whether worldstep or quickstep is the recommended choice for a physics comparison/benchmark, Jon Watte or one of the other maintainers will give you the answer soon (quickstep). Newton has its forums, with active support from Julio and others. I think you already worked a lot with Danny Chapman from jiglib over email, so that seems to be the jiglib support channel. Not sure about Tokamak; the forums seem to be spammed, so perhaps email. OpenTissue also offers support through its forums, with Kenny Erleben.

The remaining cases are Ageia PhysX and Havok. I'm not sure how non-commercial game developers get support for PhysX; I think they just threw their engine out for free to get more exposure. Havok is only available to professional game companies with a publisher.

Luckily your benchmark is open source, so developers can contribute and improve it.

Anyway, let's not heat up this discussion with rants; I appreciate your work and I wanted to give you some feedback.
Hope this helps,
Erwin
aboeing
Posts: 33
Joined: Tue Jul 26, 2005 2:28 pm

Re: PAL : Physics engine comparison : Source code and results

Post by aboeing »

Hi Erwin,
Sorry, I think that last post was a bit unclear. I certainly did not mean it to be directed at you (or in fact at anyone in particular); it was rather (a very poor) attempt to point out some of the issues in the development of this project.
That is a bit surprising; can you be more specific about which engine provider acts like that?
I’m not sure that discussing this in a public forum is most appropriate, but I will let you know “off the record”, e.g. via PM, if you would like to know which providers are unhelpful and/or “hostile”.
Would you be as helpful if Bullet did not perform well?
This was actually meant as a rhetorical question. I feel that you (and the Bullet team/community) have supported my development efforts quite a lot, and I am very grateful for your help. I in no way meant this in a negative manner, rather just to illustrate some of the possible issues with the engine providers who are not so helpful. (i.e. they make their living from their product, hence do not wish to support any initiative that would either a) demonstrate inferior performance, or b) reduce the confusopoly around their product, thus hurting their income. Or, for those who don't like conspiracy theories, I guess it's possible that since I'm not a paying customer, they aren't helping me.)
Actually Bullet doesn't perform that well.
This depends on your viewpoint: if you consider that Bullet outperforms a number of commercial offerings, then you might say it performs very well when price/performance is a concern. In terms of computational performance Bullet is perhaps not leading the pack on the PC; however, in terms of “correctness” Bullet performs very well: second best in restitution, good in friction, third best in constraint accuracy, and one of the only engines to pass the collision system test. Plus it is very easy to use!
most game developers use the engine in a very sub-optimal way
As you mentioned, I certainly don’t want to get off topic on this; however, I think your statement does add some validity to my claim concerning the “knowledge gap” between physics engine “users” and “suppliers”.

I should mention that most of the tests so far have been concerned with the accuracy of the engines rather than with run-time performance (e.g. hence worldstep rather than quickstep, etc., but I will ask). These measurements are more “static”, whereas the computational efficiency measurements are somewhat less meaningful due to the large variance in configurations (i.e. CPU, RAM, OS, compiler, etc.). But since people seem to be interested in computational efficiency, I’ll try and add more things along those lines.
I appreciate your work and I wanted to give you some feedback.
Thanks, I certainly appreciate your feedback; sorry if the last post seemed to imply anything else. Thanks for the list of the best places to get support for each engine.
*disclaimer* I’m writing this late at night, after beer :)
aboeing
Posts: 33
Joined: Tue Jul 26, 2005 2:28 pm

Re: PAL : Physics engine comparison : Source code and results

Post by aboeing »

Okay, I've finally got around to updating the online graphs, so they should be a bit more up-to-date now.
- Can you add the version and/or download date for each physics engine?
Version numbers are there now for the engines that support it.
* Suggestion for another graph: stacking benchmark, but plot the error (e.g. in terms of total penetration etc...)
This is up there now.
- Adding convex mesh/hull collision tests. Perhaps other collision shapes, like heightfield, cylinders, capsules, compounds
This is now there in the form of a 'stress' test. Lots of capsules, convex objects, spheres, and boxes are all dropped into a container.
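The scene setup is essentially the following (an engine-neutral sketch; the BodyDesc struct and helper are illustrative rather than the actual PAL calls):

    #include <cstdlib>
    #include <vector>

    // Engine-neutral description of a body to drop; the real creation call is
    // made through whichever engine is currently being benchmarked.
    enum ShapeType { SHAPE_BOX, SHAPE_SPHERE, SHAPE_CAPSULE, SHAPE_CONVEX };
    struct BodyDesc { ShapeType shape; float x, y, z; float size; float mass; };

    static float frand(float lo, float hi)
    {
        return lo + (hi - lo) * (std::rand() / static_cast<float>(RAND_MAX));
    }

    // A random mix of shapes spawned above an open container, so that they
    // pile up and keep interacting with each other.
    std::vector<BodyDesc> build_stress_scene(int num_bodies)
    {
        std::vector<BodyDesc> bodies;
        for (int i = 0; i < num_bodies; ++i) {
            BodyDesc b;
            b.shape = static_cast<ShapeType>(i % 4);  // cycle through the shapes
            b.x = frand(-4.0f, 4.0f);
            b.z = frand(-4.0f, 4.0f);
            b.y = 6.0f + 0.5f * i;                    // staggered drop heights
            b.size = 0.5f;
            b.mass = 1.0f;
            bodies.push_back(b);
        }
        return bodies;
    }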