MHB How can we find the coefficients?

Summary
The discussion centers on solving for the coefficients $c_j$ in the general solution of the initial value problem $u'(t) = Au(t)$, given the eigenvalues and eigenvectors of the matrix $A$. The initial condition leads to the equation $u^0 = \sum_{j=1}^m c_j \phi^{(j)}$, prompting questions about using the inverse of the eigenvector matrix to isolate $c_j$. Participants explore whether the eigenvectors are functions of time and the implications of orthonormality, particularly in relation to the symmetry of the matrix $A$. The conversation concludes with the acknowledgment that further simplification of the formula for $c_j$ may depend on additional properties of $A$.
mathmari
Hey! :o

We have the initial value problem $$u'(t)=Au(t) \ \ , \ \ 0 \leq t \leq T \\ u(0)=u^0 \\ u \in \mathbb{R}^m$$ where $A$ is an $m \times m$ matrix.

The eigenvalues of $A$ are $\lambda_j$ and the corresponding eigenvectors are $\phi^{(j)}$.

The general solution of the initial value problem is $$u(t)=\sum_{j=1}^m c_j e^{\lambda_jt}\phi^{(j)}$$

right??

For $t=0$ we have $$u^0=\sum_{j=1}^m c_j \phi^{(j)}$$ How can we solve for $c_j$ ?? (Wondering)

Do we maybe have to use a dot product?? (Wondering)
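(Assuming $A$ has $m$ linearly independent eigenvectors $\phi^{(j)}$, the formula above is indeed the general solution: each term satisfies the equation, since $$\frac{d}{dt}\Big(c_j e^{\lambda_j t}\phi^{(j)}\Big)=\lambda_j c_j e^{\lambda_j t}\phi^{(j)}=c_j e^{\lambda_j t}\,A\phi^{(j)}=A\Big(c_j e^{\lambda_j t}\phi^{(j)}\Big),$$ and sums of solutions are again solutions because the equation is linear.)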
 
mathmari said:
Hey! :o

We have the initial value problem $$u'(t)=Au(t) \ \ , \ \ 0 \leq t \leq T \\ u(0)=u^0 \\ u \in \mathbb{R}^m$$ where $A$ is an $m \times m$ matrix.

The eigenvalues of $A$ are $\lambda_j$ and the corresponding eigenvectors are $\phi^{(j)}$.

The general solution of the initial value problem is $$u(t)=\sum_{j=1}^m c_j e^{\lambda_jt}\phi^{(j)}$$

right??

For $t=0$ we have $$u^0=\sum_{j=1}^m c_j \phi^{(j)}$$ How can we solve for $c_j$ ?? (Wondering)

Do we maybe have to use a dot product?? (Wondering)

Hi! (Wave)

Let's make that:
$$u^0=\sum_{j=1}^m c_j \phi^{(j)} = \Big(\phi^{(j)}\Big) \begin{bmatrix}c_1\\c_2\\\vdots\\c_m\end{bmatrix}$$
See how we can solve it for $c_j$? (Wondering)
 
Last edited:
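A minimal numerical sketch of this step (the matrix $A$ and the initial value $u^0$ below are illustrative assumptions, not data from the problem): collect the eigenvectors as the columns of a matrix and solve the linear system for the coefficients.

```python
import numpy as np

# Illustrative example: a diagonalizable m x m matrix A and an initial value u0.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
u0 = np.array([1.0, 3.0])

# Eigenvalues lambda_j and eigenvectors phi^(j); eig returns them as columns of Phi.
lam, Phi = np.linalg.eig(A)

# Solve Phi c = u0 for the coefficients c_j
# (numerically preferable to forming Phi^{-1} explicitly).
c = np.linalg.solve(Phi, u0)

# Check: u0 is recovered as sum_j c_j phi^(j).
print(np.allclose(Phi @ c, u0))  # True
```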
I like Serena said:
Let's make that:
$$u^0=\sum_{j=1}^m c_j \phi^{(j)}(0) = \Big(\phi^{(j)}(0)\Big) \begin{bmatrix}c_1\\c_2\\\vdots\\c_m\end{bmatrix}$$
See how we can solve it for $c_j$? (Wondering)

Are the eigenvectors $\phi^{(j)}$ a function of $t$?? (Wondering) Because you write $\phi^{(j)}(0)$.

$\Big (\phi^{(j)}(0)\Big )$ is a matrix, isn't it?? (Wondering) So, we have to find the inverse, or not??
 
mathmari said:
Are the eigenvectors $\phi^{(j)}$ a function of $t$?? (Wondering) Because you write $\phi^{(j)}(0)$.

No I didn't! (Blush)

$\phi^{(j)}$ is a matrix, isn't it?? (Wondering) So, we have to find the inverse, or not??

Yep. (Nod)
 
I like Serena said:
Yep. (Nod)

So, $$u^0\Big (\phi^{(j)}\Big )^{-1}=\begin{bmatrix}c_1\\c_2\\\vdots\\c_m\end{bmatrix}$$ Is this correct?? (Wondering)

Now we have the vector $\begin{bmatrix}c_1\\c_2\\\vdots\\c_m\end{bmatrix}$. How can we write the formula for $c_j$ ?? (Wondering)

I found in my book the following solution $$u(t)=\sum_{j=1}^m e^{\lambda_j t}(u(0), \phi^{(j)})\phi^{(j)}$$ where $(\cdot , \cdot)$ is the Euclidean dot product.
But how did we find that?? (Wondering)
 
mathmari said:
So, $$u^0\Big (\phi^{(j)}\Big )^{-1}=\begin{bmatrix}c_1\\c_2\\\vdots\\c_m\end{bmatrix}$$ Is this correct?? (Wondering)

Matrix multiplication is not commutative, so that should be
$$\Big (\phi^{(j)}\Big )^{-1} u^0=\begin{bmatrix}c_1\\c_2\\\vdots\\c_m\end{bmatrix}$$
Now we have the vector $\begin{bmatrix}c_1\\c_2\\\vdots\\c_m\end{bmatrix}$. How can we write the formula for $c_j$ ?? (Wondering)

That is a formula for $c_j$: it is the $j$-th entry of that vector. To simplify it further, we'd need more information, like $A$ being symmetric. (Wasntme)

I found in my book the following solution $$u(t)=\sum_{j=1}^m e^{\lambda_j t}(u(0), \phi^{(j)})\phi^{(j)}$$ where $(\cdot , \cdot)$ is the Euclidean dot product.
But how did we find that?? (Wondering)

Looks like there is an assumption in there that the eigenvectors are orthonormal.
I think that is only possible if the matrix $A$ is symmetric, but that does not seem to be given - or is it? (Wondering)
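If the eigenvectors are orthonormal, the book's formula follows by taking the dot product of $u^0=\sum_{k=1}^m c_k\phi^{(k)}$ with $\phi^{(j)}$: since $(\phi^{(k)},\phi^{(j)})=\delta_{kj}$, every term except $c_j$ drops out, leaving $c_j=(u^0,\phi^{(j)})$. A minimal numerical sketch of the symmetric case (the matrix $A$ and $u^0$ below are illustrative assumptions, not data from the thread):

```python
import numpy as np

# Symmetric example matrix, so the eigenvectors can be chosen orthonormal.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
u0 = np.array([1.0, -1.0])

# eigh is for symmetric matrices; the columns of Phi are orthonormal eigenvectors.
lam, Phi = np.linalg.eigh(A)

# Dot-product formula c_j = (u0, phi^(j)) ...
c = Phi.T @ u0
# ... agrees with solving Phi c = u0, because Phi^{-1} = Phi^T here.
print(np.allclose(c, np.linalg.solve(Phi, u0)))  # True

# u(t) = sum_j c_j e^{lambda_j t} phi^(j) satisfies u(0) = u0 and u'(t) = A u(t).
def u(t):
    return Phi @ (c * np.exp(lam * t))

t = 0.7
du = Phi @ (lam * c * np.exp(lam * t))   # exact derivative of each mode
print(np.allclose(u(0.0), u0))           # True
print(np.allclose(du, A @ u(t)))         # True
```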
 
