
Convolution integral with e^At

  1. Apr 6, 2014 #1
    Hi All,

    I am taking Dynamic Systems and Controls this semester for Mechanical Engineering. We are solving non homogeneous state space equations right now. This question is about a 2x2 state space differential equation that takes the form:

    [tex]\dot{\mathbf{x}} = \mathbf{A}\mathbf{x} + \mathbf{B}u[/tex]

    Where A and B are matrices, while u is an input (like sin(t), etc). We have written this to generally be:

    [tex]\mathbf{x}(t) = e^{\mathbf{A}t}\mathbf{x}(0) + \int_0^t e^{\mathbf{A}(t - \tau)}\mathbf{B}\,u(\tau)\,d\tau[/tex]

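    (For reference, here is a minimal numerical sketch of that formula in Python, assuming an example A, B, initial state, and input u(t) = sin(t) chosen purely for illustration; the convolution integral is done with a simple trapezoidal rule rather than the Laplace transform.)

[code]
import numpy as np
from scipy.linalg import expm
from scipy.integrate import trapezoid

# Example system -- these matrices, x(0), and u(t) are illustrative choices only.
A = np.array([[-2.0, 1.0],
              [1.0, -3.0]])     # example 2x2 state matrix
B = np.array([[0.0],
              [1.0]])           # example input matrix
x0 = np.array([[1.0],
               [0.0]])          # initial state x(0)
u = lambda t: np.sin(t)         # example scalar input u(t)

def x_of_t(t, n=2000):
    # x(t) = e^{At} x(0) + integral_0^t e^{A(t - tau)} B u(tau) dtau,
    # with the convolution integral evaluated numerically.
    taus = np.linspace(0.0, t, n)
    integrand = np.stack([expm(A * (t - tau)) @ B * u(tau) for tau in taus])
    conv = trapezoid(integrand, taus, axis=0)
    return expm(A * t) @ x0 + conv

print(x_of_t(2.0))   # state vector at t = 2
[/code]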
    The situation I am wondering about is when e^At contains only real exponentials (we normally solve this with the Laplace transform, so there are no sine or cosine terms), which means it is of the form:

    [tex]e^{\mathbf{A}t} = \mathbf{C}_1 e^{\lambda_1 t} + \mathbf{C}_2 e^{\lambda_2 t}[/tex]

    Our professor agrees that C1*C2 = 0 is always true when it is of this form, which simplifies the convolution integral significantly. Additionally, in homework and lecture notes I noticed that C1*C1 = C1 and C2*C2 = C2. I asked the professor if this is always true, and he said he isn't sure that it always is, but that it was for the examples we did. What do you guys think: is it always true? If so, it simplifies the convolution integral even further. I am an engineer, not a mathematician, so my memory of differential equations and proofs isn't great, but I thought this was an interesting question.
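    (For concreteness, here is a quick numerical sanity check of those identities, using an example symmetric A chosen here. For a 2x2 A with distinct eigenvalues λ1 ≠ λ2, the partial-fraction expansion of (sI − A)^-1 from the Laplace approach gives C1 = (A − λ2 I)/(λ1 − λ2) and C2 = (A − λ1 I)/(λ2 − λ1); the script below only confirms the products for this particular example.)

[code]
import numpy as np
from scipy.linalg import expm

A = np.array([[-2.0, 1.0],
              [1.0, -3.0]])            # example symmetric A, distinct real eigenvalues
lam1, lam2 = np.linalg.eigvals(A)

# Partial fractions of (sI - A)^{-1} = C1/(s - lam1) + C2/(s - lam2)
C1 = (A - lam2 * np.eye(2)) / (lam1 - lam2)
C2 = (A - lam1 * np.eye(2)) / (lam2 - lam1)

t = 0.7
print(np.allclose(expm(A * t), C1 * np.exp(lam1 * t) + C2 * np.exp(lam2 * t)))  # True
print(np.allclose(C1 @ C2, 0))   # True:  C1*C2 = 0
print(np.allclose(C1 @ C1, C1))  # True:  C1*C1 = C1
print(np.allclose(C2 @ C2, C2))  # True:  C2*C2 = C2
[/code]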
     
  2. Apr 9, 2014 #2
    It's true provided [itex]\mathbf{A}[/itex] is a real symmetric matrix ([itex]\mathbf{A} = \mathbf{A}^T[/itex]), or more generally a Hermitian matrix. Ideally you should understand at least some of why that is. If [itex]\mathbf{A}[/itex] is a real symmetric matrix, then it has an orthogonal basis of eigenvectors, and we may write
    [tex]\mathbf{A} = \beta_1vv^T + \beta_2ww^T [/tex]
    where [itex]v[/itex] and [itex]w[/itex] are vectors such that

    [itex]v^Tv = w ^T w = 1[/itex]
    [itex]v^Tw = w^Tv = 0[/itex]

    You should verify that this means [itex]\mathbf{A}v = \beta_1 v [/itex] and [itex]\mathbf{A}w = \beta_2 w [/itex]. Furthermore, due to the properties above,

    [itex]\mathbf{A}^2 = \beta_1^2vv^T + \beta_2^2ww^T [/itex]
    [itex]\mathbf{A}^n = \beta_1^nvv^T + \beta_2^nww^T [/itex]

    Now
    [tex]e^{\mathbf{A}t} = \sum^\infty_{n = 0} \frac{\mathbf{A}^nt^n}{n!}= \mathbf{I} + \mathbf{A}t + \frac{1}{2}\mathbf{A}^2t^2 + \dots[/tex]
    So using our formula from above,
    \begin{align*} e^{\mathbf{A}t} &= \sum^\infty_{n = 0} \frac{t^n}{n!}\left(\beta_1^nvv^T + \beta_2^nww^T\right) \\
    &= vv^T\, e^{\beta_1 t} + ww^T \, e^{\beta_2 t} \end{align*}

    So, [itex]\mathbf{C}_1 = vv^T[/itex] and [itex]\mathbf{C}_2 = ww^T[/itex], and indeed
    \begin{align*}\mathbf{C}_1\mathbf{C}_2 &= v(v^Tw)w^T = v\,(0)\,w^T = \mathbf{0} \\
    \mathbf{C}_1\mathbf{C}_1 &= v(v^Tv)v^T = v(1)v^T = \mathbf{C}_1 \\
    \mathbf{C}_2\mathbf{C}_2 &= w(w^Tw)w^T = w(1)w^T = \mathbf{C}_2 \\
    \end{align*}
    But, remember we had to assume that [itex]\mathbf{A}[/itex] had a complete set of orthonormal eigenvectors, [itex]\mathbf{A} = \beta_1vv^T + \beta_2ww^T [/itex]. This is true for a real symmetric matrix, which means [itex]\mathbf{A} = \mathbf{A}^T[/itex]. For other kinds of matrices this may not hold. You will not even generally be able to write [itex]e^{\mathbf{A}t} = C_1 e^{\beta_1t} + C_2 e^{\beta_2t}[/itex] for other kinds of matrices.
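    (A small numerical illustration of both points, with example matrices chosen here: for a symmetric A, scipy.linalg.expm agrees with [itex]vv^T e^{\beta_1 t} + ww^T e^{\beta_2 t}[/itex] and the projector identities hold, while a defective matrix such as [[0, 1], [0, 0]] gives [itex]e^{\mathbf{A}t} = \mathbf{I} + \mathbf{A}t[/itex], which is not a sum of two constant-matrix-weighted exponentials.)

[code]
import numpy as np
from scipy.linalg import expm

# Symmetric case: orthonormal eigenvectors give C1 = v v^T, C2 = w w^T.
A = np.array([[-2.0, 1.0],
              [1.0, -3.0]])            # example symmetric matrix
betas, Q = np.linalg.eigh(A)           # eigh returns orthonormal eigenvectors as columns
v, w = Q[:, [0]], Q[:, [1]]            # column eigenvectors
C1, C2 = v @ v.T, w @ w.T              # spectral projectors

t = 0.7
print(np.allclose(expm(A * t), C1 * np.exp(betas[0] * t) + C2 * np.exp(betas[1] * t)))  # True
print(np.allclose(C1 @ C2, 0), np.allclose(C1 @ C1, C1), np.allclose(C2 @ C2, C2))      # True True True

# Defective (non-diagonalizable) case: e^{Nt} = I + N t, so no C1, C2 of the above form exist.
N = np.array([[0.0, 1.0],
              [0.0, 0.0]])
print(np.allclose(expm(N * t), np.eye(2) + N * t))   # True
[/code]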
     