Understanding Jacobian in relation to physics

SUMMARY

This discussion focuses on the application of Jacobians in game physics engines for resolving contact forces. The Jacobian matrix, which collects the partial differential coefficients of a function of several variables, is the tool for handling slopes in many directions at once. A worked example of a holonomic constraint (the fixed distance between two points) shows how such constraints are written for positions in any number of dimensions. Understanding these concepts does not require extensive re-learning of linear algebra, but a solid grasp of the underlying principles is beneficial.

PREREQUISITES
  • Understanding of Jacobian matrices
  • Familiarity with partial differential coefficients
  • Basic knowledge of linear algebra concepts
  • Experience with game physics engines
NEXT STEPS
  • Study the application of Jacobians in physics simulations
  • Learn about holonomic and non-holonomic constraints
  • Explore the use of partial derivatives in multi-variable functions
  • Investigate the implementation of contact resolution algorithms in game engines
USEFUL FOR

Game developers, physics simulation engineers, and anyone interested in applying mathematical concepts to game physics and contact resolution.

cboyce
I'm working with a game physics engine that uses Jacobians to resolve contact forces. It's been a few years since my physics and linear algebra classes (where we didn't get to Jacobian matrices), so what I'm reading about Jacobians is fairly overwhelming. Most of what I can find are fairly formal definitions, without any examples about what I'm specifically looking for. Can someone give me a couple simple examples about how Jacobians would apply in physics contact resolution, or point me to a resource that does? Or are they complex enough that I need to relearn all the linear algebra leading up to them to understand how to use them?
 
Thanks for the link, I think that gives me a good idea where I need to start to understand them.
 
Don't know if this helps, but if you have a simple function of a single variable, such as y = f(x), you can differentiate it to get the slope.
As there is only one direction involved, there is only one slope to choose from.
Variously we write f'(x) or dy/dx, etc.

When you are dealing with a function of several variables, as you must be, there are many directions to choose from, all with different slopes available.

The Jacobian is a method of handling this, which is why the matrix contains an array of partial differential coefficients. If you like, it is a method of resolving the slopes into as many suitable directions as are needed.
Yes, linear algebra theory confirms that this is the same number as the number of independent variables.
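To make "an array of partial differential coefficients" concrete, here is a minimal numerical sketch (not from the thread; the helper name numerical_jacobian, the example function f, and the step size eps are illustrative choices). Each row of the result corresponds to one output of the function, and each column holds the slopes taken along one independent variable:

```python
import numpy as np

def numerical_jacobian(f, x, eps=1e-6):
    """Estimate the Jacobian of f: R^n -> R^m at x using central differences."""
    x = np.asarray(x, dtype=float)
    f0 = np.asarray(f(x), dtype=float)
    J = np.zeros((f0.size, x.size))
    for j in range(x.size):
        dx = np.zeros_like(x)
        dx[j] = eps
        # Column j holds the partial derivatives of every output with respect to x[j].
        J[:, j] = (np.asarray(f(x + dx), dtype=float)
                   - np.asarray(f(x - dx), dtype=float)) / (2 * eps)
    return J

# Two outputs of two variables: f(x, y) = (x^2 * y, 5x + sin y).
f = lambda v: np.array([v[0] ** 2 * v[1], 5 * v[0] + np.sin(v[1])])
print(numerical_jacobian(f, [1.0, 2.0]))
# Analytic Jacobian: [[2xy, x^2], [5, cos y]] -> [[4, 1], [5, cos 2]].
```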
 
I've been doing quite a bit of reading, and I've been trying to get an intuitive sense of the constraints in play. Per the example of a holonomic function at http://en.wikipedia.org/wiki/Holonomic#Examples, I can intuitively understand that x^2 + y^2 - L^2 = 0 constrains a point to someplace on a circle. But then it says that given

(\vec{r_i} - \vec{r_j})^2 - L^2 = 0,

L is the distance between the two positions \vec{r_i} and \vec{r_j}, but wouldn't \vec{r_i} - \vec{r_j} already give you the distance between the two positions?
 
cboyce said:
But then it says that given

(\vec{r_i} - \vec{r_j})^2 - L^2 = 0,

L is the distance between the two positions \vec{r_i} and \vec{r_j}, but wouldn't \vec{r_i} - \vec{r_j} already give you the distance between the two positions?

In one dimension, yes. But note the boldface: \vec{r_i} and \vec{r_j} are vectors here, so \vec{r} = \vec{r_i} - \vec{r_j} is the vector specifying the difference between the two. Squaring it here means taking the dot product (http://en.wikipedia.org/wiki/Dot_product) of the resulting vector with itself, i.e. the x-component squared plus the y-component squared and so on. So this works for any number of dimensions: by the Pythagorean theorem, \sqrt{x^2 + y^2} is the distance in 2D, \sqrt{x^2 + y^2 + z^2} in 3D, and so on.
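To tie this back to the original question about contact and constraint resolution, here is only a sketch, assuming the standard way constraint-based engines use such a distance constraint (the thread itself does not spell this out). Differentiating the scalar constraint with respect to each position gives the entries of one row of the Jacobian, and the solver works with the velocity-level form:

C(\vec{r_i}, \vec{r_j}) = (\vec{r_i} - \vec{r_j}) \cdot (\vec{r_i} - \vec{r_j}) - L^2 = 0

\frac{\partial C}{\partial \vec{r_i}} = 2(\vec{r_i} - \vec{r_j}), \qquad \frac{\partial C}{\partial \vec{r_j}} = -2(\vec{r_i} - \vec{r_j})

J = \begin{bmatrix} 2(\vec{r_i} - \vec{r_j})^T & -2(\vec{r_i} - \vec{r_j})^T \end{bmatrix}, \qquad J\dot{q} = 0 \text{ with } \dot{q} = (\dot{\vec{r_i}}, \dot{\vec{r_j}})

Each constraint contributes one such row; stacking the rows for all contacts and joints gives the Jacobian matrix the engine hands to its solver.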
 
