MHB How can we find the coefficients?

mathmari
Hey! :o

We have the initial value problem $$u'(t)=Au(t) \ \ , \ \ 0 \leq t \leq T \\ u(0)=u^0 \\ u \in \mathbb{R}^m$$ where $A$ is an $m \times m$ matrix.

The eigenvalues of $A$ are $\lambda_j$ and the corresponding eigenvectors are $\phi^{(j)}$.

The general solution of the initial value problem is $$u(t)=\sum_{j=1}^m c_j e^{\lambda_jt}\phi^{(j)}$$

right??

For $t=0$ we have $$u^0=\sum_{j=1}^m c_j \phi^{(j)}$$ How can we solve for $c_j$ ?? (Wondering)

Do we maybe have to use a dot product?? (Wondering)
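As a quick check (assuming $A$ has $m$ linearly independent eigenvectors, e.g. when the $\lambda_j$ are distinct), differentiating the series term by term and using $A\phi^{(j)}=\lambda_j\phi^{(j)}$ shows that it really does solve the system:
$$u'(t)=\sum_{j=1}^m c_j \lambda_j e^{\lambda_j t}\phi^{(j)}=\sum_{j=1}^m c_j e^{\lambda_j t} A\phi^{(j)}=A\,u(t)$$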
 
mathmari said:
Hey! :o

We have the initial value problem $$u'(t)=Au(t) \ \ , \ \ 0 \leq t \leq T \\ u(0)=u^0 \\ u \in \mathbb{R}^m$$ where $A$ is an $m \times m$ matrix.

The eigenvalues of $A$ are $\lambda_j$ and the corresponding eigenvectors are $\phi^{(j)}$.

The general solution of the initial value problem is $$u(t)=\sum_{j=1}^m c_j e^{\lambda_jt}\phi^{(j)}$$

right??

For $t=0$ we have $$u^0=\sum_{j=1}^m c_j \phi^{(j)}$$ How can we solve for $c_j$ ?? (Wondering)

Do we maybe have to use a dot product?? (Wondering)

Hi! (Wave)

Let's make that:
$$u^0=\sum_{j=1}^m c_j \phi^{(j)} = \Big(\phi^{(j)}\Big) \begin{bmatrix}c_1\\c_2\\\vdots\\c_m\end{bmatrix}$$
See how we can solve it for $c_j$? (Wondering)
 
Last edited:
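For instance, with made-up numbers for $m=2$: if $\phi^{(1)}=\begin{bmatrix}1\\1\end{bmatrix}$, $\phi^{(2)}=\begin{bmatrix}1\\-1\end{bmatrix}$ and $u^0=\begin{bmatrix}3\\1\end{bmatrix}$, then $u^0=c_1\phi^{(1)}+c_2\phi^{(2)}$ is just the linear system
$$\begin{bmatrix}1&1\\1&-1\end{bmatrix}\begin{bmatrix}c_1\\c_2\end{bmatrix}=\begin{bmatrix}3\\1\end{bmatrix},$$
which gives $c_1=2$ and $c_2=1$.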
I like Serena said:
Let's make that:
$$u^0=\sum_{j=1}^m c_j \phi^{(j)}(0) = \Big(\phi^{(j)}(0)\Big) \begin{bmatrix}c_1\\c_2\\\vdots\\c_m\end{bmatrix}$$
See how we can solve it for $c_j$? (Wondering)

Are the eigenvectors $\phi^{(j)}$ a function of $t$?? (Wondering) Because you write $\phi^{(j)}(0)$.

$\Big (\phi^{(j)}(0)\Big )$ is a matrix, isn't it?? (Wondering) So, we have to find the inverse, or not??
 
mathmari said:
Are the eigenvectors $\phi^{(j)}$ a function of $t$?? (Wondering) Because you write $\phi^{(j)}(0)$.

No I didn't! (Blush)

$\phi^{(j)}$ is a matrix, isn't it?? (Wondering) So, we have to find the inverse, or not??

Yep. (Nod)
 
I like Serena said:
Yep. (Nod)

So, $$u^0\Big (\phi^{(j)}\Big )^{-1}=\begin{bmatrix}c_1\\c_2\\\vdots\\c_m\end{bmatrix}$$ Is this correct?? (Wondering)

Now we have the vector $\begin{bmatrix}c_1\\c_2\\\vdots\\c_m\end{bmatrix}$. How can we write the formula for $c_j$ ?? (Wondering)

I found in my book the following solution $$u(t)=\sum_{j=1}^m e^{\lambda_j t}(u(0), \phi^{(j)})\phi^{(j)}$$ where $(\cdot , \cdot)$ is the Euclidean dot product.
But how did we find that?? (Wondering)
 
mathmari said:
So, $$u^0\Big (\phi^{(j)}\Big )^{-1}=\begin{bmatrix}c_1\\c_2\\\vdots\\c_m\end{bmatrix}$$ Is this correct?? (Wondering)

The product is not commutative, so that should be
$$\Big (\phi^{(j)}\Big )^{-1} u^0=\begin{bmatrix}c_1\\c_2\\\vdots\\c_m\end{bmatrix}$$

Now we have the vector $\begin{bmatrix}c_1\\c_2\\\vdots\\c_m\end{bmatrix}$. How can we write the formula for $c_j$ ?? (Wondering)

That is a formula for $c_j$. To simplify it, we'd need more information, like $A$ being symmetric. (Wasntme)

I found in my book the following solution $$u(t)=\sum_{j=1}^m e^{\lambda_j t}(u(0), \phi^{(j)})\phi^{(j)}$$ where $(\cdot , \cdot)$ is the Euclidean dot product.
But how did we find that?? (Wondering)

Looks like there is an assumption in there that the eigenvectors are orthonormal.
I think that is only possible if the matrix $A$ is symmetric, but that does not seem to be given - or is it? (Wondering)
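To spell out where the book's formula comes from (under that orthonormality assumption): taking the Euclidean dot product of $u^0=\sum_{k=1}^m c_k\phi^{(k)}$ with $\phi^{(j)}$ and using $(\phi^{(k)},\phi^{(j)})=\delta_{kj}$ kills every term except the $j$-th, so
$$c_j=(u^0,\phi^{(j)})$$
and substituting into the general solution gives exactly $u(t)=\sum_{j=1}^m e^{\lambda_j t}(u(0),\phi^{(j)})\phi^{(j)}$. Without orthonormality one falls back on solving the linear system, i.e. $c=\Big(\phi^{(j)}\Big)^{-1}u^0$.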
 