So, in a section on applying eigenvectors to differential equations (quite a jump in the learning curve), I've encountered
e^{At} \vec{u}(0) = \vec{u}(t)
as a solution to certain differential equations, when we use the trial substitution y = e^{\lambda t} and the system has constant coefficients. I understand the idea more or less, but the math gets quite involved, particularly with complex eigenvalues and eigenvectors.
I can see how (or at least I can follow and accept the explanation for) e^{At} being an infinite series.
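For reference, the series I mean is the standard matrix-exponential definition, with each power of $At$ a matrix product:

$$
e^{At} = I + At + \frac{(At)^2}{2!} + \frac{(At)^3}{3!} + \cdots = \sum_{k=0}^{\infty} \frac{(At)^k}{k!}
$$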
But how does e^{\Lambda t} reduce to a single matrix with e^{\lambda_i t} on its diagonal?
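To make the question concrete, here is a small numerical check I can do (a sketch using SciPy's `expm`; the 2×2 matrix and eigenvalues 1 and −2 are just an arbitrary example I made up): summing the series for a diagonal $\Lambda$ does seem to give exactly $e^{\lambda_i t}$ on the diagonal, and I'd like to understand why.

```python
import numpy as np
from scipy.linalg import expm

# Arbitrary example eigenvalues (hypothetical, just for illustration)
lam = np.array([1.0, -2.0])
t = 0.5

# Diagonal matrix Lambda with the eigenvalues on its diagonal
Lambda = np.diag(lam)

# Left side: the matrix exponential e^{Lambda t}, computed by SciPy
# (which effectively evaluates the infinite series)
lhs = expm(Lambda * t)

# Right side: the claimed closed form, e^{lambda_i t} on the diagonal
rhs = np.diag(np.exp(lam * t))

print(np.allclose(lhs, rhs))  # the two agree numerically
```

So empirically the identity holds; my question is about the algebraic reason the series collapses this way for a diagonal matrix.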