Calculate the Jacobian of this function

mobe

Homework Statement


Could you please help me with this problem?

Let f(x) = (f_1(x), f_2(x)) map \mathbb{R}^2 into itself, where f_1 and f_2 have continuous first and second partial derivatives in each variable. Assume that f maps the origin to itself and that the Jacobian matrix J_f(x) is an invertible 2 \times 2 matrix for all x. Put g(x) = x - f'(x)^{-1} f(x).

(i) Explicitly compute J_g(x), using the relation J_f^{-1} J_f = I_2 (the 2 \times 2 identity matrix).


Thanks in advance!


Homework Equations




The Attempt at a Solution


What are f'(x)^{-1} and f'(x)^{-1} f(x)? I am just having trouble with the notation.
Can you give me hints?
(For some reason, I can't get the tex to render right.)
 
Well firstly, I don't think you've entered your question correctly. If f \in C^2(\mathbb{R}^2), f : \mathbb{R}^2 \rightarrow \mathbb{R}^2, then it's more likely that f(x,y) = ( f_1(x,y), f_2(x,y) ). If f were indeed simply a function of x, then your Jacobian matrix would be singular.

Now your Jacobian matrix for f will look like

J_f = \begin{pmatrix} \partial_x f_1 & \partial_y f_1 \\ \partial_x f_2 & \partial_y f_2 \end{pmatrix}

Then you don't need to explicitly calculate f^{-1}(x,y), since you only need its Jacobian matrix:

J_f^{-1} = \frac{1}{\partial_x f_1 \, \partial_y f_2 - \partial_x f_2 \, \partial_y f_1} \begin{pmatrix} \partial_y f_2 & -\partial_y f_1 \\ -\partial_x f_2 & \partial_x f_1 \end{pmatrix}.

Is f'(x)^{-1} \cdot f(x) the dot product? If it is, then g : \mathbb{R}^2 \rightarrow \mathbb{R} and J_g \in M_{1 \times 2}(\mathbb{R}).
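As a quick sanity check of that inverse formula (using a made-up map, not necessarily the f from your problem): for f(x,y) = ( x + y^2, \ y ),

J_f = \begin{pmatrix} 1 & 2y \\ 0 & 1 \end{pmatrix}, \qquad
J_f^{-1} = \frac{1}{1 \cdot 1 - 0 \cdot 2y} \begin{pmatrix} 1 & -2y \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} 1 & -2y \\ 0 & 1 \end{pmatrix},

and multiplying out does give J_f^{-1} J_f = I_2.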
 
Thanks for the reply!
I got up to g(x) = x - f'(x)^{-1} f(x) written in terms of J_f and J_f^{-1}. I want to compute J_g, but continuing to do the same thing (taking partial derivatives) would make J_g look awfully ugly. I am curious if there is another way to get J_g? I got a hint, J_f^{-1} J_f = I_2, but I do not know how to use it.
 
Kreizhn said:
Is f'(x)^{-1} \cdot f(x) the dot product? If it is, then g : \mathbb{R}^2 \rightarrow \mathbb{R} and J_g \in M_{1 \times 2}(\mathbb{R}).
Yes, I believe that it is the dot product.
 
mobe said:
I am curious if there is another way to get J_g? I got a hint, J_f^{-1} J_f = I_2, but I do not know how to use it.

Just to move it down here.
 
I'm pretty sure you should be able to decompose g(x) over its sums. That is,

g(x) = x - \frac{df^{-1}}{dx} f(x) = x - f_1 \frac{df_1^{-1}}{dx} - f_2 \frac{df_2^{-1}}{dx}

Now I'm not 100% sure about this, but I think that in your case, since g : \mathbb{R}^2 \rightarrow \mathbb{R} and J_g \in M_{1 \times 2}(\mathbb{R}), you can say J(f+g) = J(f) + J(g) and J(fg) = (Jf)g + f(Jg), and just substitute f and g with the appropriate functions. Play around with it and see what happens.

Edit: I'm not sure if this will hold in higher dimensions, but I think it holds in M_{1 \times 2}(\mathbb{R}) since the Jacobian just ends up being the gradient of each scalar function. Not sure if that helps you at all.
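One more thought, since the hint you mentioned was J_f^{-1} J_f = I_2: if you differentiate both sides of that identity with respect to x_k, you get \partial_{x_k}(J_f^{-1}) \, J_f + J_f^{-1} \, \partial_{x_k} J_f = 0, i.e. \partial_{x_k}(J_f^{-1}) = - J_f^{-1} (\partial_{x_k} J_f) J_f^{-1}, which may be what the hint is pointing at.

If you want something concrete to check a general formula against, here is a small symbolic sketch (sympy). It reads f'(x)^{-1} f(x) as the inverse Jacobian matrix applied to the column vector f(x), and it uses a made-up map, not the f from your problem:

import sympy as sp

x, y = sp.symbols('x y')
X = sp.Matrix([x, y])
f = sp.Matrix([x + y**2, y])   # made-up example: f(0,0) = (0,0) and det J_f = 1 everywhere

Jf = f.jacobian(X)             # Jacobian matrix of f
g = X - Jf.inv() * f           # g(x) = x - J_f(x)^{-1} f(x)
Jg = sp.simplify(g.jacobian(X))

print(Jg)                      # the explicit Jacobian of g for this example
print(Jg.subs({x: 0, y: 0}))   # the zero matrix at the origin

For this particular f it returns g(x,y) = (y^2, 0) and J_g(0,0) = 0, so at least it gives you one concrete case to compare against whatever general expression you derive.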
 
I think g : \mathbb{R}^2 \rightarrow \mathbb{R} and J_g \in M_{1 \times 2}(\mathbb{R}) is right. I will use your suggestion and see where it gets me. Thanks! (I may post more questions.)
 