How can diagonal matrices help solve eigenvalue problems?

  • Context: Graduate
  • Thread starter: soulflyfgm
  • Tags: Eigenvalues, Laplace

Discussion Overview

The discussion revolves around the use of diagonal matrices in solving eigenvalue problems, focusing on the matrix exponential \( e^{At} \) of a diagonalizable matrix \( A \). Participants explore several routes (evaluating at \( t = 0 \), taking the Laplace transform, and working out explicit diagonal examples) for showing that the matrices \( Z_k \) in the expansion \( e^{At} = \sum_k Z_k e^{\lambda_k t} \) sum to the identity matrix.

Discussion Character

  • Exploratory
  • Technical explanation
  • Mathematical reasoning

Main Points Raised

  • One participant requests hints on how to solve a problem related to the matrix exponential and eigenvalues, referencing a specific equation involving \( e^{At} \).
  • Another participant suggests that if \( A \) is diagonalizable, \( e^{At} \) can be computed by focusing on the diagonal entries after diagonalization, mentioning the Laplace Transform.
  • A different participant proposes evaluating the equation at \( t=0 \) to show that the sum of the \( Z_k \) terms equals the identity matrix, indicating a possible simplification.
  • Another participant provides a general example of how the sum of diagonal matrices leads to the identity matrix, suggesting that this can be extended to other values of \( t \).
  • One participant mentions having proven a related equation but expresses uncertainty about how to connect it to the sum of the \( Z_k \) terms.
  • A later reply illustrates the case of a diagonal matrix and explicitly computes \( e^{At} \) for a \( 2 \times 2 \) matrix, reinforcing the relationship between the diagonal form and the identity matrix.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the best approach to prove the connection between the matrix exponential and the identity matrix, with multiple methods and interpretations being discussed.

Contextual Notes

Some participants' arguments depend on the assumption that \( A \) is diagonalizable, and there are unresolved steps in the mathematical reasoning presented.

soulflyfgm
Hi, can someone give me any hints on how to solve this problem? Thank you.


I tried to type it here but it didn't come up, so I uploaded it here: http://tinypic.com/view.php?pic=2hgtqoz&s=3

Thank you so much


Recall that for an nxn matrix A with distinct eigenvalues [tex]\lambda_{k}[/tex], k=1,2,...,n,

[tex]e^{At} = \sum^{n}_{k=1} Z_{k}e^{\lambda_{k}t}[/tex]

By taking the Laplace Transform of both sides (or otherwise), show that

[tex]\sum^{n}_{k=1}Z_{k} = I_{n}[/tex]

where [tex]I_{n}[/tex] is the nxn identity matrix.
 
Although stated in a rather awkward way, what is asked here boils down to showing that, if A is diagonalizable, one can compute [itex]e^{At}[/itex] by computing only the diagonal entries after diagonalizing. Keep in mind that [itex]\mathcal{L}(e^{At}) = (sI - A)^{-1}[/itex].

So once you have the diagonal form, apply the inverse Laplace transform.

I am sure that you can do it, so think about it for a while...
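The diagonalization route above can be sketched numerically. The matrix below is a hypothetical example (not from the thread), chosen upper triangular with distinct eigenvalues 2 and 3 so that e^{At} has a simple closed form to check against:

```python
import numpy as np

# A hypothetical diagonalizable matrix with distinct eigenvalues 2 and 3.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
t = 0.5

# Diagonalize: A = V diag(lam) V^{-1}, hence e^{At} = V diag(e^{lam*t}) V^{-1}.
lam, V = np.linalg.eig(A)
eAt = V @ np.diag(np.exp(lam * t)) @ np.linalg.inv(V)

# For this triangular A the exponential has a known closed form:
# e^{At} = [[e^{2t}, e^{3t} - e^{2t}], [0, e^{3t}]].
expected = np.array([[np.exp(2 * t), np.exp(3 * t) - np.exp(2 * t)],
                     [0.0,           np.exp(3 * t)]])
print(np.allclose(eAt, expected))  # True
```

The point of the exercise is exactly this: once A is diagonalized, exponentiating reduces to exponentiating the scalar eigenvalues on the diagonal.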
 
Okay, maybe I read it totally wrong, but if you know that

[tex]e^{At} = \sum^{n}_{k=1} Z_{k}e^{\lambda_{k}t}[/tex]

for all t, then use it for t=0 and get

[tex]I = e^{0} = e^{A0} = \sum^{n}_{k=1} Z_{k}e^{\lambda_{k}0} = \sum^{n}_{k=1} Z_{k}[/tex]

and you are done?
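The t=0 argument can be checked numerically. Here the Z_k are built as the spectral projectors Z_k = v_k w_k^T, with v_k the k-th column of V and w_k the k-th row of V^{-1} (a standard construction; the 2x2 matrix is a hypothetical example, not from the thread):

```python
import numpy as np

# Hypothetical 2x2 example with distinct eigenvalues.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

lam, V = np.linalg.eig(A)
Vinv = np.linalg.inv(V)

# Spectral projectors: Z_k = outer(v_k, w_k), so sum_k Z_k = V V^{-1} = I.
Z = [np.outer(V[:, k], Vinv[k, :]) for k in range(len(lam))]

# At t = 0 every factor e^{lambda_k * 0} equals 1, so the sum collapses to I.
print(np.allclose(sum(Z), np.eye(2)))  # True
```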
 
The sum works with the general form of the following:
[tex] \left[ {\begin{array}{*{20}c} 1 & 0 \\ 0 & 0 \end{array}} \right] + \left[ {\begin{array}{*{20}c} 0 & 0 \\ 0 & 1 \end{array}} \right] = I[/tex]

So it is directly related to the diagonal form... Thus you can also prove it for non-zero t.
 
OK, I have proven this much so far:

[tex](sI-A)\mathcal{L}(e^{At}) = I[/tex]

but I do not know how to prove that this is equal to [tex]\sum^{n}_{k=1}Z_{k}[/tex].

Any hints? thank you so much
 
OK, but that alone tells you nothing. How about this: let A be a diagonal matrix; then for the 2x2 case,

[tex] e^{At} = e^{\left[ {\begin{array}{*{20}c} \alpha & 0 \\ 0 & \beta \end{array}} \right]t} = \left[ {\begin{array}{*{20}c} e^{\alpha t} & 0 \\ 0 & e^{\beta t} \end{array}} \right] = \left[ {\begin{array}{*{20}c} 1 & 0 \\ 0 & 0 \end{array}} \right]e^{\alpha t} + \left[ {\begin{array}{*{20}c} 0 & 0 \\ 0 & 1 \end{array}} \right]e^{\beta t}[/tex]
And, then from the previous post,
[tex] \left[ {\begin{array}{*{20}c} 1 & 0 \\ 0 & 0 \end{array}} \right] + \left[ {\begin{array}{*{20}c} 0 & 0 \\ 0 & 1 \end{array}} \right] = I[/tex]

Can you see the pattern now?
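The 2x2 diagonal pattern above can be verified numerically. The values alpha = 1, beta = -2 are hypothetical placeholders for the symbols in the post:

```python
import numpy as np

# Hypothetical diagonal A matching the 2x2 case above (alpha = 1, beta = -2).
alpha, beta = 1.0, -2.0
A = np.diag([alpha, beta])
t = 0.7

# The two "building block" matrices from the posts above.
Z1 = np.array([[1.0, 0.0], [0.0, 0.0]])
Z2 = np.array([[0.0, 0.0], [0.0, 1.0]])

# For diagonal A: e^{At} = Z1 e^{alpha t} + Z2 e^{beta t}, and Z1 + Z2 = I.
eAt = Z1 * np.exp(alpha * t) + Z2 * np.exp(beta * t)
print(np.allclose(eAt, np.diag(np.exp(np.array([alpha, beta]) * t))))  # True
print(np.allclose(Z1 + Z2, np.eye(2)))  # True
```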
 
