A (challenging?) question around the Jacobian matrix

In summary, the thread discusses the invertibility of the matrix-valued function H(x) with entries H_{i,j}(x) = \int_0^1 \partial_i g^j(tx) dt, where g is a diffeomorphism of R^n, so that its Jacobian matrix is everywhere invertible. The question is whether H(x), which can be read as an average of the Jacobian matrix of g along the segment from 0 to x, is invertible for every x. The thread produces a counterexample in three dimensions, while in two dimensions the statement is believed to be true because of the behaviour of coordinate lines and their tangents; a proof sketch of the non-singularity of H in two dimensions is given.
  • #1
Aleph-0
Here is the problem:

Suppose that g is a diffeomorphism on R^n. Then we know that its Jacobian matrix is everywhere invertible.
Let us define the following matrix valued function on R^n
[tex]
H_{i,j} (x) = \int_0^1 \partial_i g^j(tx) dt
[/tex]
where [tex]g^j[/tex] are the components of g.
Question: Is [tex](H_{i,j}(x))_{i,j}[/tex] (which can be interpreted as an average of the Jacobian matrix of g along the segment from 0 to x) invertible for every x?

My guess is that the answer is no, but I can find no counterexample.
Any help?
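One observation that may be useful (I state it only as a remark, not a solution): integrating [tex]\frac{d}{dt} g^j(tx) = \sum_i x_i \partial_i g^j(tx)[/tex] over [tex]t \in [0,1][/tex] gives

[tex]\sum_i x_i H_{i,j}(x) = g^j(x) - g^j(0)[/tex]

Since g is injective, the right-hand side is nonzero for [tex]x \neq 0[/tex], so x is never in the kernel of [tex]H(x)^T[/tex]; the question is whether H(x) can annihilate some other direction.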
 
  • #2
I believe you may have to move into three dimensions to obtain a counterexample. It is possible that this theorem is in fact true.

Edit:
Actually, I believe that a counterexample is provided by;

[tex]\mathbf{g}((x,y,z))=(x \cos(2\pi z)-y \sin(2\pi z),x \sin(2\pi z)+y \cos(2\pi z),z)[/tex]
with
[tex]H((0,0,1))[/tex] being singular.

Can anyone else check this?
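For anyone who wants to check this numerically: the sketch below (the function names are mine) evaluates H by a midpoint-rule quadrature of the analytic Jacobian of the map above, and confirms that the Jacobian of g itself has determinant 1 everywhere while H((0,0,1)) is singular.

```python
import numpy as np

TWO_PI = 2.0 * np.pi

def jac_g(p):
    """Jacobian Dg at p = (x, y, z); row j, column i holds dg^j/dx_i."""
    x, y, z = p
    c, s = np.cos(TWO_PI * z), np.sin(TWO_PI * z)
    return np.array([
        [c, -s, -TWO_PI * (x * s + y * c)],
        [s,  c,  TWO_PI * (x * c - y * s)],
        [0.0, 0.0, 1.0],
    ])

def H(p, n=2000):
    """H_{i,j}(x) = int_0^1 d_i g^j(t x) dt, midpoint rule with n nodes."""
    p = np.asarray(p, dtype=float)
    ts = (np.arange(n) + 0.5) / n
    # transpose each Jacobian so rows are indexed by i, matching H_{i,j}
    return np.mean([jac_g(t * p).T for t in ts], axis=0)

print(np.linalg.det(jac_g((0.3, -1.2, 0.7))))  # 1: the Jacobian of g is everywhere invertible
print(np.linalg.det(H((0.0, 0.0, 1.0))))       # ~ 0: H((0,0,1)) is singular
```

Along the segment from 0 to (0,0,1) the rotation angle sweeps a full period, so the averaged cosine and sine entries integrate to zero and the top-left 2x2 block of H vanishes.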
 
  • #3
Indeed, we even have H((x,y,1)) singular for any x, y!
Thank you!
I wonder whether the theorem is true in dimension 2...
(It is trivially true in dimension 1.)
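For completeness, the dimension-1 case in one line: for [tex]x \neq 0[/tex],

[tex]H(x) = \int_0^1 g'(tx) dt = \frac{g(x)-g(0)}{x} \neq 0[/tex]

since g is injective, while [tex]H(0) = g'(0) \neq 0[/tex].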
 
  • #4
Interesting problem. I thought about this for a while and couldn't come up with a counterexample in D = 2. Can anyone prove this statement? It's not clear to me why it should be true, and there seems to be something interesting (maybe topological) lurking behind it.
 
  • #5
Haelfix said:
Interesting problem. I thought about this for a while and couldn't come up with a counterexample in D = 2. Can anyone prove this statement? It's not clear to me why it should be true, and there seems to be something interesting (maybe topological) lurking behind it.
I believe it is true because of the nature of coordinate lines and their tangents in 2D.

First, assume that we can find an H in 2D that is singular. Then the columns of H, which we'll denote [tex]\mathbf{h}_i[/tex], are linearly dependent, so there exist [tex]\alpha_i[/tex], not both zero, such that

[tex]\Sigma \alpha_i \mathbf{h}_i = \mathbf{0}[/tex]

Now, by definition,

[tex]\mathbf{h}_i = \int_0^1 \frac{\partial \mathbf{g}}{\partial x_i}(t \mathbf{x}) dt[/tex]
Therefore

[tex]\Sigma \alpha_i \mathbf{h}_i =\int_0^1 \Sigma \alpha_i \frac{\partial \mathbf{g}}{\partial x_i}(t \mathbf{x}) dt = \mathbf{0} [/tex]

Now look at the integrand along the segment of integration; without loss of generality we may take [tex]\mathbf{x}=(1,0)[/tex] (composing g with a rotation and a rescaling preserves all the hypotheses). Let
[tex]\mathbf{v}(t)=\Sigma \alpha_i \frac{\partial \mathbf{g}}{\partial x_i}(t(1,0)) = \alpha_1 \frac{\partial \mathbf{g}}{\partial x_1}(t(1,0)) + \alpha_2 \frac{\partial \mathbf{g}}{\partial x_2}(t(1,0))[/tex]
And so we have
[tex]\int_0^1 \mathbf{v}(t) dt = \mathbf{0} [/tex]
We cannot have [tex]\mathbf{v}(t)=\mathbf{0}[/tex] for any t, because the Jacobian always has full rank, and hence no nontrivial linear combination of its columns can vanish.

So [tex]\mathbf{v}(t)[/tex] is nonzero everywhere, yet its integral is zero. If we let [tex]\mathbf{v}(t)=\frac{d \mathbf{z}}{dt}(t)[/tex], we can see that

[tex]\int_0^1 \mathbf{v}(t) dt = \int_0^1 \frac{d \mathbf{z}}{dt}(t) dt = \mathbf{z}(1)-\mathbf{z}(0) = \mathbf{0}[/tex]

This means that [tex]\mathbf{v}[/tex] is the tangent vector of some closed curve [tex]\mathbf{z}(t)[/tex].

We now make a change of variables by letting
[tex]\mathbf{x}(\mathbf{u}) = \left( \begin{array}{cc}
\alpha_1 & -\alpha_2 \\
\alpha_2 & \alpha_1
\end{array} \right) \mathbf{u}
[/tex]

Then it can be shown that [tex]\mathbf{v}(t) = \frac{\partial \mathbf{g}}{\partial u_1}(t(1,0))[/tex], i.e. [tex]\mathbf{v}[/tex] is the tangent vector to the [tex]u_2=0[/tex] coordinate curve. But this means that this coordinate curve is closed. Since the mapping from u to x is a diffeomorphism and g is a diffeomorphism, the composition [tex]\mathbf{g} \circ \mathbf{x}[/tex] is a diffeomorphism of R^2. A diffeomorphism is injective, so it cannot carry a coordinate line onto a closed curve. Therefore we have a contradiction, and so H must be non-singular.

I hope this is OK. I'm not being especially rigorous here.

In higher dimensions, some of the [tex]\alpha_i[/tex] can be zero, and hence the mapping from u to x may not be a diffeomorphism. Essentially you have some wiggle room once you move into higher dimensions.
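None of this replaces a rigorous proof, but the 2D claim can at least be spot-checked numerically. The sketch below (my own construction) uses g(x, y) = (x + sin(y)/2, y + sin(x)/2), which is a diffeomorphism of R^2: its Jacobian determinant is at least 3/4 everywhere, and it is a Lipschitz-small perturbation of the identity, hence proper and therefore globally invertible by Hadamard's theorem.

```python
import numpy as np

def jac_g(p):
    """Jacobian of g(x, y) = (x + sin(y)/2, y + sin(x)/2); det >= 3/4 > 0."""
    x, y = p
    return np.array([[1.0, 0.5 * np.cos(y)],
                     [0.5 * np.cos(x), 1.0]])

def H(p, n=512):
    """H_{i,j}(x) = int_0^1 d_i g^j(t x) dt, midpoint rule with n nodes."""
    p = np.asarray(p, dtype=float)
    ts = (np.arange(n) + 0.5) / n
    return np.mean([jac_g(t * p).T for t in ts], axis=0)

# sample random points and record det H at each
rng = np.random.default_rng(0)
dets = [np.linalg.det(H(rng.uniform(-10, 10, size=2))) for _ in range(100)]
print(min(dets))  # stays well away from zero for this map
```

For this particular g the off-diagonal entries of H are averages of (cos)/2 terms, so det H is at least 3/4 at every point; the sample only illustrates the claim, it does not prove it for general 2D diffeomorphisms.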
 

1. What is the Jacobian matrix and what is its purpose?

The Jacobian matrix is the matrix of first-order partial derivatives of a vector-valued function of several variables. It encodes the function's best linear approximation at a point, and it is used to change between coordinate systems and to differentiate compositions of maps in higher dimensions.

2. How is the Jacobian matrix used in machine learning and data analysis?

In machine learning and data analysis, the Jacobian (and the gradient, its single-output special case) drives gradient-based optimization: backpropagation propagates derivatives through a model by multiplying the Jacobians of its layers, and gradient descent uses the resulting derivatives to fit model parameters by minimizing a loss function.

3. What are the properties of the Jacobian matrix?

The Jacobian matrix of a function from R^n to R^m has dimensions m x n, so it is square only when m = n. In that square case, the inverse function theorem says the function is locally invertible at a point exactly when the Jacobian determinant there is nonzero.

4. Can the Jacobian matrix be used to solve systems of equations?

Yes, indirectly: the Jacobian matrix is central to Newton's method for systems of nonlinear equations f(x) = 0. Each iteration solves the linear system J(x_k) dx = -f(x_k) and updates x_{k+1} = x_k + dx, converging rapidly to a root when started close enough to one.
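To make this concrete, here is a minimal sketch of Newton's method for a 2x2 system (the example system and function names are my own, chosen for illustration):

```python
import numpy as np

def newton(f, jac, x0, tol=1e-10, max_iter=50):
    """Newton's method for f(x) = 0: solve J(x_k) dx = -f(x_k), then update x."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        fx = f(x)
        if np.linalg.norm(fx) < tol:
            break
        x = x + np.linalg.solve(jac(x), -fx)
    return x

# example system: x^2 + y^2 = 1 and y = x, with a root at (1/sqrt(2), 1/sqrt(2))
f = lambda v: np.array([v[0]**2 + v[1]**2 - 1.0, v[1] - v[0]])
jac = lambda v: np.array([[2.0 * v[0], 2.0 * v[1]],
                          [-1.0, 1.0]])
root = newton(f, jac, [1.0, 0.5])
```

Starting from (1.0, 0.5), the iteration converges to (0.7071..., 0.7071...) in a handful of steps; in practice the Jacobian is often approximated by finite differences or automatic differentiation rather than written by hand.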

5. What are some real-world applications of the Jacobian matrix?

The Jacobian matrix has many applications in fields such as physics, engineering, and economics. It is used in robotics and control systems to model and optimize movements, in economics to model supply and demand curves, and in fluid dynamics to model flows and turbulence. It is also widely used in machine learning and data analysis for optimization and parameter estimation.
