# Proving basic linear ODE results

1. Feb 28, 2017

### MxwllsPersuasns

1. The problem statement, all variables and given/known data

Please bear with the length of this post, I'm taking it one step at a time starting with i)

Let A: I → gl(n, R) be a smooth function where I ⊂ R is an interval and gl(n, R) denotes the vector space of all n × n matrices.

(i) If F : I → gl(n, R) satisfies the matrix ODE F' = F A , then det F satisfies the scalar ODE (det F)' = tr A det F. Here tr B denotes the trace (the sum of the diagonal elements) of an n × n matrix B.

(ii) If F : I → gl(n, R) satisfies the matrix ODE F' = F A and for some t0 ∈ I we have F(t0) ∈ GL(n, R), where GL(n, R) denotes the group of invertible matrices, then F : I → GL(n, R).

(iii) If F : I → gl(n, R) satisfies the matrix ODE F' = F A and tr A = 0 then det F(t) is constant in t. In particular, if det F(t0) = 1 then det F(t) = 1 for all t ∈ I.

(iv) If F : I → gl(n, R) satisfies the matrix ODE F' = F A and A: I → so(n, R) takes values in skew-symmetric matrices, then F^T(t)F(t) is constant in t. In particular, if F(t0) ∈ O(n, R) then F : I → O(n, R). The analogous statement holds for F(t0) ∈ SO(n, R).
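Before diving into proofs, statements (iii) and (iv) can at least be checked numerically. Below is a minimal sketch (assuming NumPy/SciPy are available; the matrix size and the choice of a constant skew-symmetric A are my own, not part of the problem) that integrates F' = F A with F(t0) = I and checks that det F and F^T F stay constant:

```python
import numpy as np
from scipy.integrate import solve_ivp

n = 3
rng = np.random.default_rng(0)

# A constant skew-symmetric matrix: tr A = 0, so (iii) predicts det F constant,
# and (iv) predicts F^T F constant.
M = rng.standard_normal((n, n))
A = M - M.T  # skew-symmetric: A^T = -A

def rhs(t, y):
    F = y.reshape(n, n)
    return (F @ A).ravel()  # the matrix ODE F' = F A, flattened for solve_ivp

F0 = np.eye(n)  # F(t0) = I is orthogonal with det F(t0) = 1
sol = solve_ivp(rhs, (0.0, 2.0), F0.ravel(), rtol=1e-10, atol=1e-12,
                t_eval=np.linspace(0.0, 2.0, 5))

for y in sol.y.T:
    F = y.reshape(n, n)
    assert np.isclose(np.linalg.det(F), 1.0, atol=1e-6)  # (iii): det F stays 1
    assert np.allclose(F.T @ F, np.eye(n), atol=1e-6)    # (iv): F^T F stays I
```

This is only a sanity check in one example, of course, not a proof.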

2. Relevant equations

Not sure; I guess the fact that the derivative of a matrix with respect to a scalar is the matrix of entrywise derivatives (differentiate each element of the original matrix with respect to that scalar) would be something to use.

3. The attempt at a solution

So I'm guessing, from the way this is worded, that we basically just need to demonstrate each of these facts. My attempt for part (i) started by assuming we are differentiating the matrix F with respect to some variable (I used t). I took F and A to each be 2 × 2 matrices, with the standard ƒ11 in the upper-left corner of F and so on through α22 in the bottom-right corner of A. Thus each element of A is of the form αij(t) and each element of F is of the form ƒij(t), with each parameterized by the variable t and (i, j) ∈ {1, 2}.

So where I'm running into issues is that I'm not sure whether: 1) I'm supposed to show the first ODE by differentiating F and showing it equals the original matrix multiplied by another matrix (A), in which case I'm stuck, since differentiating F only yields the same matrix with every element differentiated; that's not really like multiplying by another matrix. Or 2) I'm supposed to solve the matrix ODE in a way analogous to a regular ODE (f'(x) = γf(x)), in which case I'm not sure how exactly to do that. Do I just treat it as a system of equations (one per entry, right?) in one variable and solve that way?
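On the second reading, one thing that can be tried numerically: treat the matrix ODE as one scalar ODE per entry. Here is a minimal sketch (assuming NumPy/SciPy; I've taken A constant for simplicity, which the problem doesn't require) comparing the entrywise integration of F' = F A against the closed form F(t) = F(0)·e^{tA} that the scalar analogy f' = γf suggests:

```python
import numpy as np
from scipy.linalg import expm
from scipy.integrate import solve_ivp

# For constant A, the matrix ODE F' = F A has the closed-form solution
# F(t) = F(0) @ expm(t A), analogous to f(t) = f(0) e^{gamma t} in the scalar case.
A = np.array([[0.0, 1.0], [-2.0, 3.0]])
F0 = np.array([[1.0, 2.0], [0.0, 1.0]])

def rhs(t, y):
    return (y.reshape(2, 2) @ A).ravel()  # four scalar ODEs, one per entry

T = 1.5
sol = solve_ivp(rhs, (0.0, T), F0.ravel(), rtol=1e-10, atol=1e-12)
F_numeric = sol.y[:, -1].reshape(2, 2)
F_exact = F0 @ expm(T * A)
assert np.allclose(F_numeric, F_exact, atol=1e-6)
```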

Any help along this journey of ODE discovery is immensely appreciated!

2. Mar 1, 2017

### Staff: Mentor

Actually seeing the matrices involved would be helpful. In the 2x2 case you have this for the matrix F(t):
$F(t) = \begin{bmatrix}f_{11}(t) & f_{12}(t) \\ f_{21}(t) & f_{22}(t) \end{bmatrix}$
and similarly for A(t):
$A(t) = \begin{bmatrix}a_{11}(t) & a_{12}(t) \\ a_{21}(t) & a_{22}(t) \end{bmatrix}$
For part i), you're given that F satisfies the equation F' = F A. Can you show that |F|' equals tr(A) |F|?
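If it helps to convince yourself before proving it by hand, here is a quick symbolic check (assuming SymPy; not a proof for general n, just the 2×2 case) that differentiating det F and substituting the ODE F' = F A entrywise yields tr(A) det F:

```python
import sympy as sp

t = sp.symbols('t')
# Build 2x2 matrices of unspecified functions f_ij(t) and a_ij(t).
f = sp.Matrix(2, 2, lambda i, j: sp.Function(f'f{i+1}{j+1}')(t))
a = sp.Matrix(2, 2, lambda i, j: sp.Function(f'a{i+1}{j+1}')(t))

# Differentiate det F, then substitute the ODE F' = F A entry by entry.
detF_prime = sp.diff(f.det(), t)
FA = f * a
subs = {sp.diff(f[i, j], t): FA[i, j] for i in range(2) for j in range(2)}
lhs = sp.expand(detF_prime.subs(subs))
rhs = sp.expand(a.trace() * f.det())
assert sp.simplify(lhs - rhs) == 0  # (det F)' == tr(A) det F in the 2x2 case
```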

3. Mar 1, 2017

### MxwllsPersuasns

Okay, so basically what I did was multiply F and A (I won't write out the product matrix here), then take det(FA) (since that's det(F)'), and after some cancellations I ended up with something like

$\det(FA) = \{f_{11}\alpha_{11}f_{22}\alpha_{22} + f_{12}\alpha_{21}f_{21}\alpha_{12}\} - \{f_{11}\alpha_{12}f_{22}\alpha_{21} + f_{12}\alpha_{22}f_{21}\alpha_{11}\}$

and so then I tried calculating det(F)·tr(A):

$|F| \cdot \operatorname{tr}(A) = \{f_{11}f_{22} - f_{12}f_{21}\}(\alpha_{11} + \alpha_{22})$

But here I can already see that I'll get terms with only three factors each (rather than four per term in det(FA)). I'm not sure if maybe I did something wrong in my computations (I double-checked and feel confident), or perhaps I'm missing some cancellation somewhere, but I just don't see how to get these two things to equal one another. Any suggestions?
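For what it's worth, a quick symbolic sanity check (assuming SymPy) confirms the four-factor expansion is internally consistent: it collapses to det(F)·det(A), the product of determinants, which is why its terms have four factors rather than three:

```python
import sympy as sp

# Symbolic 2x2 check: expand det(F A) and compare with det(F) * det(A).
f11, f12, f21, f22 = sp.symbols('f11 f12 f21 f22')
a11, a12, a21, a22 = sp.symbols('a11 a12 a21 a22')
F = sp.Matrix([[f11, f12], [f21, f22]])
A = sp.Matrix([[a11, a12], [a21, a22]])

det_FA = sp.expand((F * A).det())
assert sp.simplify(det_FA - F.det() * A.det()) == 0  # det(FA) == det F * det A
```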

4. Mar 1, 2017

### MxwllsPersuasns

Also, for number (ii): is this basically saying that if there's a point t0 at which F belongs to the group of invertible matrices (and thus is invertible at that point), and F satisfies F' = FA, then F(t) is invertible for all t ∈ I? I'm not sure how I could argue that. Or, if I'm totally off base here, can someone please assist?
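One numerical illustration of what (ii) would mean, granting the scalar ODE from (i): det F solves (det F)' = tr(A) det F, so det F(t) = det F(t0)·exp(∫ tr A), which can never reach zero once det F(t0) ≠ 0. A minimal sketch (assuming NumPy/SciPy, with a constant A of my own choosing):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Constant A for simplicity; tr A = -2 here, and det F0 = 1 != 0.
A = np.array([[1.0, 4.0], [2.0, -3.0]])
F0 = np.array([[2.0, 1.0], [1.0, 1.0]])

def rhs(t, y):
    return (y.reshape(2, 2) @ A).ravel()  # the matrix ODE F' = F A, entrywise

ts = np.linspace(0.0, 1.0, 50)
sol = solve_ivp(rhs, (0.0, 1.0), F0.ravel(), t_eval=ts, rtol=1e-10, atol=1e-12)
dets = np.array([np.linalg.det(y.reshape(2, 2)) for y in sol.y.T])

# Prediction from the scalar ODE: det F(t) = det F(0) * exp(t * tr A).
predicted = np.linalg.det(F0) * np.exp(np.trace(A) * ts)
assert np.allclose(dets, predicted, atol=1e-6)
assert np.all(np.abs(dets) > 0)  # det F never vanishes, so F stays invertible
```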