How Do Matrix ODEs Relate to Determinants and Traces?


Homework Help Overview

The discussion revolves around matrix ordinary differential equations (ODEs) and their relationships with determinants and traces. The original poster presents a series of statements regarding a smooth function F that maps an interval into the space of n × n matrices, and how it satisfies a specific matrix ODE involving another matrix A. The focus is on understanding the implications of these statements, particularly how the determinant of F relates to the trace of A.

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning

Approaches and Questions Raised

  • Participants discuss the differentiation of matrices and the implications of the matrix ODE F' = F A. There are attempts to express the determinant of F in terms of the trace of A and to understand the conditions under which F remains invertible. Questions arise about the correct approach to differentiate and manipulate these matrix equations.

Discussion Status

Some participants are exploring the relationships between the determinant and trace, with one participant attempting to compute the determinant of the product FA and comparing it to the expression involving the trace of A. Others are questioning the implications of the invertibility condition and how to argue that F remains invertible if it is invertible at a specific point.

Contextual Notes

Participants note the complexity of differentiating matrices and the potential need for clearer examples or representations of the matrices involved. There is also an acknowledgment of the challenges in aligning the terms from different expressions and ensuring the correctness of computations.

MxwllsPersuasns

Homework Statement



Please bear with the length of this post; I'm taking it one step at a time, starting with (i).

Let A: I → gl(n, R) be a smooth function where I ⊂ R is an interval and gl(n, R) denotes the vector space of all n × n matrices.

(i) If F : I → gl(n, R) satisfies the matrix ODE F' = F A , then det F satisfies the scalar ODE (det F)' = tr A det F. Here tr B denotes the trace (the sum of the diagonal elements) of an n × n matrix B.

(ii) If F : I → gl(n, R) satisfies the matrix ODE F' = F A and for some t0 ∈ I we have F(t0) ∈ GL(n, R), where GL(n, R) denotes the group of invertible matrices, then F : I → GL(n, R).

(iii) If F : I → gl(n, R) satisfies the matrix ODE F' = F A and tr A = 0 then det F(t) is constant in t. In particular, if det F(t0) = 1 then det F(t) = 1 for all t ∈ I.

(iv) If F : I → gl(n, R) satisfies the matrix ODE F' = F A and A : I → so(n, R) takes values in the skew-symmetric matrices, then F^T(t)F(t) is constant in t. In particular, if F(t0) ∈ O(n, R) then F : I → O(n, R). The analogous statement holds for F(t0) ∈ SO(n, R).

Homework Equations



Not sure; I guess we'd use the fact that the derivative of a matrix with respect to a scalar is the matrix whose entries are the derivatives of the original entries with respect to that scalar.
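In symbols (writing it out for the 2 × 2 case), the entrywise derivative I have in mind would be:

##F'(t) = \begin{bmatrix} f_{11}'(t) & f_{12}'(t) \\ f_{21}'(t) & f_{22}'(t) \end{bmatrix}##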

The Attempt at a Solution



So I'm guessing, from the way this is worded, that we basically just need to demonstrate each of these facts. My attempt for part i) started by assuming we are taking the derivative of the matrix F with respect to some variable (I used t). I took F and A to each be 2 × 2 matrices, with the standard f11 in the upper-left corner of F and so on through a22 in the bottom-right corner of A. So each entry of A is of the form aij(t) and each entry of F is of the form fij(t), each parameterized by the variable t, with i, j ∈ {1, 2}.

Where I'm running into issues is that I'm not sure whether 1) I'm supposed to show the first ODE by differentiating F and showing it equals the original matrix multiplied by another (A). In that case I'm stuck, since differentiating F just yields the same matrix with every entry differentiated, which doesn't obviously look like multiplying by another matrix. Or 2) I'm supposed to solve the matrix ODE analogously to a regular scalar ODE (f'(x) = γf(x)), in which case I'm not sure how exactly to do that. Do I just treat it as a system of equations (one for each entry?) in one variable and solve it that way?

Any help along this journey of ODE discovery is immensely appreciated!
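In the meantime, here's a quick numerical sanity check I put together for (i): pure Python, Euler-integrating F' = FA with a hypothetical constant 2 × 2 matrix A (my own made-up example, not from the problem), and comparing det F(1) against det F(0)·e^{tr A}, which is what the scalar ODE in (i) predicts when A is constant:

```python
import math

# Sanity check of (i): if F' = F A, then (det F)' = tr(A) det F,
# so for constant A, det F(t) = det F(0) * exp(t * tr A).
# A below is a hypothetical example matrix, not from the problem.

def matmul(X, Y):
    """Product of two 2x2 matrices stored as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def det2(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

A = [[0.3, -1.2], [0.7, 0.5]]   # hypothetical constant A, tr A = 0.8
F = [[1.0, 0.0], [0.0, 1.0]]    # F(0) = identity, det F(0) = 1
h, steps = 1e-4, 10_000         # Euler steps up to t = 1

for _ in range(steps):
    FA = matmul(F, A)           # right-hand side of F' = F A
    F = [[F[i][j] + h * FA[i][j] for j in range(2)] for i in range(2)]

print(det2(F), math.exp(0.8))   # the two numbers should nearly agree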
 
MxwllsPersuasns said:

(homework statement and attempt at a solution quoted above)
Actually seeing the matrices involved would be helpful. In the 2x2 case you have this for the matrix F(t):
##F(t) = \begin{bmatrix}f_{11}(t) & f_{12}(t) \\ f_{21}(t) & f_{22}(t) \end{bmatrix}##
and similarly for A(t):
##A(t) = \begin{bmatrix}a_{11}(t) & a_{12}(t) \\ a_{21}(t) & a_{22}(t) \end{bmatrix}##
For part i), you're given that F satisfies the equation F' = F A. Can you show that |F|' equals tr(A) |F|?
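If you get stuck aligning terms, here is a sketch of the route in the 2 × 2 case (just the product rule applied entrywise):

##(\det F)' = (f_{11}f_{22} - f_{12}f_{21})' = f_{11}'\,f_{22} + f_{11}\,f_{22}' - f_{12}'\,f_{21} - f_{12}\,f_{21}'##

Now substitute each ##f_{ij}'## from ##F' = FA## (for instance ##f_{11}' = f_{11}a_{11} + f_{12}a_{21}##) and collect terms; everything that is not a multiple of ##a_{11} + a_{22}## cancels, leaving ##(\det F)' = (a_{11} + a_{22})(f_{11}f_{22} - f_{12}f_{21}) = \operatorname{tr}(A)\,\det F##.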
MxwllsPersuasns said:

(questions about the approach quoted above)
 
Okay, so basically what I did was multiply F and A (I won't write out the product matrix here), then take det(FA) (since I figured that's det(F)'), and after some cancellations I ended up with

##\det(FA) = \{f_{11}a_{11}f_{22}a_{22} + f_{12}a_{21}f_{21}a_{12}\} - \{f_{11}a_{12}f_{22}a_{21} + f_{12}a_{22}f_{21}a_{11}\}##

and so then I try calculating det(F)·tr(A)...

##|F| \cdot \operatorname{tr}(A) = \{f_{11}f_{22} - f_{12}f_{21}\}\{a_{11} + a_{22}\}##

but here I can already see that I'll get terms with only three factors each (rather than four per term in det(FA)). I'm not sure if I did something wrong in my computations (I double-checked and feel confident) or if I'm missing some cancellation somewhere, but I just don't see how to get these two expressions to equal one another. Any suggestions?
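One thing I did try, to test my own assumption, is a quick numerical experiment (pure Python, with a hypothetical constant 2 × 2 matrix A of my own): if I Euler-integrate F' = FA, a finite-difference estimate of (det F)' matches tr(A)·det F, but it does not match det(FA), which instead comes out as det(F)·det(A). So maybe my starting assumption that det(F)' = det(FA) is the problem:

```python
import math

# Testing my assumption: is det(FA) really the derivative of det F?
# A is a hypothetical constant 2x2 matrix; F' = F A, F(0) = identity.

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def det2(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

A = [[0.3, -1.2], [0.7, 0.5]]   # tr A = 0.8, det A = 0.99
F = [[1.0, 0.0], [0.0, 1.0]]
h = 1e-4
for _ in range(5_000):          # Euler-integrate F' = F A up to t = 0.5
    FA = matmul(F, A)
    F = [[F[i][j] + h * FA[i][j] for j in range(2)] for i in range(2)]

FA = matmul(F, A)               # this is F'(t) at t = 0.5
F_next = [[F[i][j] + h * FA[i][j] for j in range(2)] for i in range(2)]
d_detF = (det2(F_next) - det2(F)) / h   # finite-difference (det F)'

print(d_detF, 0.8 * det2(F))    # these two agree: (det F)' = tr(A) det F
print(det2(FA), 0.99 * det2(F)) # det(FA) = det F * det A, a different number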
 
Also, for number ii): is this basically saying that if there's a point t_0 at which F belongs to the group of invertible matrices (and thus is invertible at that point), and F satisfies F' = FA, then F(t) is invertible for every t in I? I'm not sure how I could argue that. Or, if I'm totally off base here, can someone please assist?
 
