Understanding the Inverse of Jacobian Matrices: A Brief Overview

parsesnip
Homework Statement
I want to prove that ##\left| \frac{\partial(x,y)}{\partial(u,v)} \right|=\frac{1}{\left|\frac{\partial(u,v)}{\partial(x,y)}\right|}## (If this is true)
Relevant Equations
##\frac{\partial(x,y)}{\partial(u,v)}=\begin{vmatrix} x_u & x_v\\ y_u&y_v \end {vmatrix}##
##\frac{\partial(u,v)}{\partial(x,y)}=\begin{vmatrix} u_x & u_y\\ v_x&v_y \end {vmatrix}##
I got that ##{x_u}{y_v}-{x_v}{y_u}=\frac{1}{\frac{1}{{x_u}{y_v}}-\frac{1}{{y_u}{x_v}}}##. But this implies that ##{x_u}{x_v}{y_u}{y_v}=-1##, and I don't see how that is true?
 
In the very simple case ##u=x,\ v=y## the statement obviously holds, but there
$$x_u x_v y_u y_v = 1\cdot 0\cdot 0\cdot 1 = 0,$$
so your result seems wrong.

Find 2×2 matrices ##A, B## by partial differentiation:
$$\begin{pmatrix} du \\ dv \end{pmatrix} = A \begin{pmatrix} dx \\ dy \end{pmatrix}$$
$$\begin{pmatrix} dx \\ dy \end{pmatrix} = B \begin{pmatrix} du \\ dv \end{pmatrix}$$
So
$$AB = BA = E$$
$$\det A \,\det B = 1$$
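For a concrete sanity check of these two relations, here is a minimal NumPy sketch (my own addition, not part of the thread), using plane polar coordinates ##u = r##, ##v = \theta## as the test map; the test point is arbitrary:

```python
import numpy as np

# Sketch only: numerically check AB = E and det(A) det(B) = 1 from post #2,
# using u = r = sqrt(x^2 + y^2), v = theta = atan2(y, x),
# with inverse x = r cos(theta), y = r sin(theta).

x, y = 1.3, 0.7                          # arbitrary test point
r, theta = np.hypot(x, y), np.arctan2(y, x)

# A = d(u,v)/d(x,y): rows are (dr/dx, dr/dy) and (dtheta/dx, dtheta/dy)
A = np.array([[x / r,      y / r],
              [-y / r**2,  x / r**2]])

# B = d(x,y)/d(u,v): rows are (dx/dr, dx/dtheta) and (dy/dr, dy/dtheta)
B = np.array([[np.cos(theta), -r * np.sin(theta)],
              [np.sin(theta),  r * np.cos(theta)]])

print(np.round(A @ B, 12))                    # ~ identity matrix
print(np.linalg.det(A) * np.linalg.det(B))    # ~ 1.0
```

It should print something within rounding error of the identity matrix and of 1.0.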
 
By the chain rule
$$1 = \frac{\partial x}{\partial x} = \frac{\partial x}{\partial u}\frac{\partial u}{\partial x} + \frac{\partial x}{\partial v}\frac{\partial v}{\partial x}$$
and
$$0 = \frac{\partial y}{\partial x} = \frac{\partial y}{\partial u}\frac{\partial u}{\partial x} + \frac{\partial y}{\partial v}\frac{\partial v}{\partial x},$$
and similarly ##\frac{\partial y}{\partial y} = 1## and ##\frac{\partial x}{\partial y} = 0##. Now I suspect that if you expand
$$1 = \frac{\partial x}{\partial x}\frac{\partial y}{\partial y} - \frac{\partial y}{\partial x}\frac{\partial x}{\partial y}$$
and rearrange it then you will obtain your result.
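(Carrying out the expansion pasmith suggests, a step not spelled out in the thread: substituting the chain-rule expressions, the products ##x_u y_u u_x u_y## and ##x_v y_v v_x v_y## cancel and what remains factors as
$$1 = (x_u u_x + x_v v_x)(y_u u_y + y_v v_y) - (y_u u_x + y_v v_x)(x_u u_y + x_v v_y) = (x_u y_v - x_v y_u)(u_x v_y - u_y v_x),$$
so ##\frac{\partial(x,y)}{\partial(u,v)}\,\frac{\partial(u,v)}{\partial(x,y)} = 1##, which is the statement to be proved.)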
 
pasmith said:
By the chain rule
$$1 = \frac{\partial x}{\partial x} = \frac{\partial x}{\partial u}\frac{\partial u}{\partial x} + \frac{\partial x}{\partial v}\frac{\partial v}{\partial x}$$
and
$$0 = \frac{\partial y}{\partial x} = \frac{\partial y}{\partial u}\frac{\partial u}{\partial x} + \frac{\partial y}{\partial v}\frac{\partial v}{\partial x},$$
and similarly ##\frac{\partial y}{\partial y} = 1## and ##\frac{\partial x}{\partial y} = 0##. Now I suspect that if you expand
$$1 = \frac{\partial x}{\partial x}\frac{\partial y}{\partial y} - \frac{\partial y}{\partial x}\frac{\partial x}{\partial y}$$
and rearrange it then you will obtain your result.
I think I understand. But if ##\frac {\partial x}{\partial u}=\frac{1}{\frac {\partial u}{\partial x}}##, then doesn't ##\frac {\partial x}{\partial u}\frac {\partial u}{\partial x}=1##? Then how do you resolve the contradiction that ##1 = \frac{\partial x}{\partial x} = \frac{\partial x}{\partial u}\frac{\partial u}{\partial x} + \frac{\partial x}{\partial v}\frac{\partial v}{\partial x} = 2##?
 
parsesnip said:
I think I understand. But if ##\frac {\partial x}{\partial u}=\frac{1}{\frac {\partial u}{\partial x}}##, then doesn't ##\frac {\partial x}{\partial u}\frac {\partial u}{\partial x}=1##? Then how do you resolve the contradiction that ##1 = \frac{\partial x}{\partial x} = \frac{\partial x}{\partial u}\frac{\partial u}{\partial x} + \frac{\partial x}{\partial v}\frac{\partial v}{\partial x} = 2##?

The relation ##\frac{\partial x}{\partial u} = \left(\frac{\partial u}{\partial x}\right)^{-1}## does not hold in general. For example, with plane polar coordinates we have ##\frac{\partial r}{\partial x} = \frac xr## and ##\frac{\partial x}{\partial r} = \cos \theta = \frac xr \neq \frac rx##.

Rather, if that relation does hold then it implies that ##x## is independent of ##v##, so that ##\frac{\partial x}{\partial v} = 0##.
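(A quick symbolic check of the polar-coordinate example, my own addition using SymPy rather than anything from the thread:)

```python
import sympy as sp

# Sketch only: in plane polar coordinates, dr/dx and dx/dr are equal
# (both x/r), not reciprocals of each other.
x, y = sp.symbols('x y', positive=True)
r = sp.sqrt(x**2 + y**2)

dr_dx = sp.diff(r, x)              # dr/dx = x/r
dx_dr = x / r                      # dx/dr = cos(theta), written in terms of x, y

print(sp.simplify(dr_dx - dx_dr))  # 0  -> the two partials are equal
print(sp.simplify(dr_dx * dx_dr))  # x**2/(x**2 + y**2), not 1
```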
 
Further to my post #2:

$$A=\begin{pmatrix} u_x & u_y \\ v_x & v_y \end{pmatrix}, \qquad B=\begin{pmatrix} x_u & x_v \\ y_u & y_v \end{pmatrix}$$
$$AB=\begin{pmatrix} u_x & u_y \\ v_x & v_y \end{pmatrix}\begin{pmatrix} x_u & x_v \\ y_u & y_v \end{pmatrix}=\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$$

This agrees with post #3.

$$A=B^{-1}$$
$$\begin{pmatrix} u_x & u_y \\ v_x & v_y \end{pmatrix}=\frac{1}{x_u y_v-x_v y_u}\begin{pmatrix} y_v & -x_v \\ -y_u & x_u \end{pmatrix}$$

For example, the (1,1) component says
$$u_x=\frac{y_v}{x_u y_v-x_v y_u}.$$
You see that, in order to have ##u_x =\frac{1}{x_u}## as you expect, ##x_v y_u=0## and ##y_v \neq 0## are required.
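(As a concrete check, my own addition using the polar example from post #4: with ##x=r\cos\theta##, ##y=r\sin\theta## and ##(u,v)=(r,\theta)##,
$$x_u y_v - x_v y_u = r\cos^2\theta + r\sin^2\theta = r, \qquad u_x = \frac{y_v}{x_u y_v - x_v y_u} = \frac{r\cos\theta}{r} = \cos\theta = \frac{x}{r},$$
which reproduces ##\frac{\partial r}{\partial x} = \frac{x}{r}## and equals ##\frac{1}{x_u} = \frac{1}{\cos\theta}## only where ##x_v y_u = -r\sin^2\theta = 0##.)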
 