How to do a Magnus expansion for a time-dependent companion matrix?

  • Thread starter: adf89812
  • Tags: Matrix
Summary
The discussion focuses on solving a system of time-dependent first-order ordinary differential equations (ODEs) using the Magnus expansion and related methods. It highlights that traditional methods like diagonalization become complicated because the eigenvalues and eigenvectors are themselves time-dependent. For n-dimensional vectors and matrices, the solution involves an ordered exponential, which is necessary whenever A(t) does not commute with itself at different times. Power series can also be used, but this approach leads to recurrence relations that are not necessarily easier to handle. The consensus is that numerical methods are generally required to evaluate the ordered exponential in practice.
adf89812
TL;DR
How do I compute the Magnus expansion for a time-dependent companion matrix, or solve the system by substitution instead?
For example, consider the following system of two first-order ODEs:
$$
\left\{\begin{array}{l}
x_1^{\prime}=2 t x_1+t^2 x_2 \\
x_2^{\prime}=t^3 x_1+4 t x_2
\end{array}\right.
$$

This is a linear homogeneous system of two first-order ODEs with $$A(t)=\left[\begin{array}{ll}2 t & t^2 \\ t^3 & 4 t\end{array}\right].$$
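(For reference in the discussion below, here is a minimal numerical sketch of this system, assuming NumPy and SciPy are available; the initial condition and time interval are purely illustrative.)

```python
import numpy as np
from scipy.integrate import solve_ivp

def A(t):
    """Time-dependent coefficient matrix of the system above."""
    return np.array([[2.0 * t, t**2],
                     [t**3,    4.0 * t]])

def rhs(t, x):
    """Right-hand side of x'(t) = A(t) x(t)."""
    return A(t) @ x

x0 = np.array([1.0, 0.0])                     # illustrative initial condition
sol = solve_ivp(rhs, (0.0, 1.0), x0, rtol=1e-10, atol=1e-12)
print(sol.y[:, -1])                           # high-accuracy x(1) for later comparison
```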


"Secondly, the substitution method works in the same manner as usual. Indeed, the first line of the system leads to $$y = t^{-2}\dot{x} + 2t^{-1}x$$, which can be differentiated in order to find $\dot{y}$ in terms of $$x$$ and $$t$$. Next, plugging these expressions into the second line, you will end up with a second-order linear ODE with non-constant coefficients for $$x$$, which itself might not be easy to solve in the present case."

"
Firstly, you mentioned diagonalization; however, in that case the eigenvalues and the eigenvectors will themselves be time-dependent. If $$S$$ denotes the change of basis allowing the diagonalization of $$A$$ as $$D$$, i.e. $$D = SAS^{-1}$$, then the system of equations $$\dot{u} = Au$$, where $$u = (x,y)$$, becomes
$$
\dot{v} = \partial_t(Su) = S\dot{u} + \dot{S}u = \left(SA + \dot{S}\right)u = \left(SAS^{-1} + \dot{S}S^{-1}\right)Su = \left(D + \dot{S}S^{-1}\right)v
$$
for $$v = Su$$. This new system might be even harder to solve, because of the extra (non-diagonal) term $$\dot{S}S^{-1}$$."
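As a quick symbolic check of the first point, the eigenvalues of ##A(t)## here are indeed time-dependent, so ##S = S(t)## is unavoidable (a minimal sketch, assuming SymPy is available):

```python
import sympy as sp

t = sp.symbols('t', positive=True)
A = sp.Matrix([[2*t, t**2],
               [t**3, 4*t]])
# The eigenvalues come out as 3t +/- sqrt(t**2 + t**5): genuinely t-dependent,
# so the diagonalizing change of basis S must depend on t as well.
sp.pprint(A.eigenvals())
```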
 
Please let me know: is ##x' = \dot{x} = \frac{dx}{dt}##?
 
So is it basically ##\dot{x}(t)=A(t)x(t)##?
One sort of solution would be ##x(t)=e^{\int_0^t A(s)\,ds}\,x(0)##.

I think so, am I wrong?
 
billtodd said:
I think so, am I wrong?
You are correct only if ##x,A## are scalars (1-dimensional). Otherwise, if ##x(t)## is an n-dimensional vector and ##A(t)## is an ##n\times n## matrix, then the correct solution involves the ordered exponential, built from nested integrals over products of ##A(t)##. This reduces to the usual exponential for special matrices that commute at different times, such as a constant matrix.
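For concreteness, here is one way the ordered exponential can be sketched numerically, as a product of short-time propagators with later times placed to the left (assuming SciPy; the midpoint rule and step count are arbitrary choices):

```python
import numpy as np
from scipy.linalg import expm

def ordered_exp(A, t, n=2000):
    """Approximate the time-ordered exponential T exp(int_0^t A(s) ds)
    by a product of short-time exponentials, later times on the left."""
    dt = t / n
    M = np.eye(A(0.0).shape[0])
    for k in range(n):
        s = (k + 0.5) * dt             # midpoint of the k-th subinterval
        M = expm(A(s) * dt) @ M        # left-multiply: later times to the left
    return M
```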
 
renormalize said:
You are correct only if ##x,A## are scalars (1-dimensional). Otherwise, if ##x(t)## is an n-dimensional vector and ##A(t)## is an ##n\times n## matrix, then the correct solution involves the ordered exponential, built from nested integrals over products of ##A(t)##. This reduces to the usual exponential for special matrices that commute at different times, such as a constant matrix.
I was referring to the vector case.
Obviously using ##e^A=\sum_{n=0}^\infty A^n/n!##.
And here (in the OP) you first integrate ##A## and then plug it into the sum. Though I am not sure whether there is a closed-form solution here.
 
billtodd said:
I was referring to the vector case.
Obviously using ##e^A=\sum_{n=0}^\infty A^n/n!##.
And here (in the OP) you first integrate ##A## and then plug it into the sum. Though I am not sure whether there is a closed-form solution here.
Sorry, that doesn't work to solve your vector ODE for a general matrix ##A(t)##. Just try it out: start from your proposed "solution" ##x\left(t\right)=\exp\left[\intop_{0}^{t}A\left(t^{\prime}\right)dt^{\prime}\right]x\left(0\right)## with the exponential expanded as a series. Now differentiate the series term-by-term to see if ##x(t)## satisfies ##\dot{x}\left(t\right)=A\left(t\right)x\left(t\right)##. (Hint: it won't, unless ##A(t_1)## commutes with ##A(t_2)## for all pairs ##t_1,t_2##, as happens for a constant matrix, so that ##A## can be moved completely to the left of the exponential.)
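One can confirm this numerically for the matrix in the OP, where ##\int_0^t A(s)\,ds = \begin{pmatrix} t^2 & t^3/3 \\ t^4/4 & 2t^2 \end{pmatrix}## (a minimal check, assuming NumPy/SciPy; the initial condition is illustrative):

```python
import numpy as np
from scipy.linalg import expm
from scipy.integrate import solve_ivp

def A(t):
    return np.array([[2*t, t**2], [t**3, 4*t]])

def intA(t):
    """Elementwise antiderivative int_0^t A(s) ds."""
    return np.array([[t**2, t**3/3], [t**4/4, 2*t**2]])

x0, t1 = np.array([1.0, 0.0]), 1.0
naive = expm(intA(t1)) @ x0                      # the proposed "solution"
exact = solve_ivp(lambda t, x: A(t) @ x, (0.0, t1), x0,
                  rtol=1e-12, atol=1e-14).y[:, -1]
print(naive, exact)   # the two disagree, since A(t1) and A(t2) don't commute
```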
 
Hi @billtodd. If you write ##x(t) = M(t) x(0)##, then ##\dot{x}(t) = \dot{M}(t) x(0) = A(t) x(t) = A(t) M(t) x(0)##. This implies you need to solve:

\begin{align*}
\frac{d}{dt} M(t) = A(t) M(t)
\end{align*}

subject to ##M(0)= \mathbb{1}##. I derived the formal solution to this for the general case of a time dependent matrix ##A(t)## here:

https://www.physicsforums.com/threa...unction-valued-matrices.1046714/#post-6814958

and it involves the time-ordered product of matrices ##A(t_1),A(t_2), \dots##. This solution reduces to ##M(t) = \exp (\int_0^t A (t') dt')## in the case where ##A(t)## is a constant matrix (or, more generally, commutes with itself at different times).
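Connecting this back to the thread title: the Magnus expansion writes ##M(t) = e^{\Omega(t)}## with ##\Omega = \Omega_1 + \Omega_2 + \cdots##, where ##\Omega_1 = \int_0^t A(s)\,ds## and ##\Omega_2 = \tfrac{1}{2}\int_0^t ds_1 \int_0^{s_1} ds_2\, [A(s_1), A(s_2)]##. Here is a crude quadrature sketch of the second-order truncation (assuming NumPy/SciPy; the midpoint rule and step count are illustrative choices):

```python
import numpy as np
from scipy.linalg import expm

def magnus2(A, t, n=400):
    """Second-order Magnus approximation M(t) ~ expm(Omega1 + Omega2),
    with both integrals evaluated by a simple midpoint rule."""
    h = t / n
    s = (np.arange(n) + 0.5) * h
    As = [A(si) for si in s]
    O1 = sum(As) * h                      # Omega_1 = int_0^t A(s) ds
    O2 = np.zeros_like(O1)
    for i in range(n):                    # Omega_2: integrate the commutator
        for j in range(i):                # over the triangle s_j < s_i
            O2 += 0.5 * (As[i] @ As[j] - As[j] @ As[i]) * h * h
    return expm(O1 + O2)
```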
 
renormalize said:
Sorry, that doesn't work to solve your vector ODE for a general matrix ##A(t)##. Just try it out: start from your proposed "solution" ##x\left(t\right)=\exp\left[\intop_{0}^{t}A\left(t^{\prime}\right)dt^{\prime}\right]x\left(0\right)## with the exponential expanded as a series. Now differentiate the series term-by-term to see if ##x(t)## satisfies ##\dot{x}\left(t\right)=A\left(t\right)x\left(t\right)##. (Hint: it won't, unless ##A(t_1)## commutes with ##A(t_2)## for all pairs ##t_1,t_2##, as happens for a constant matrix, so that ##A## can be moved completely to the left of the exponential.)
So how would you solve it?
One can try power series for both ##x_1(t),x_2(t)##. But then you translate the differential equations into recurrence relations, which doesn't necessarily make the problem any easier (a sketch of that approach is given below).
OK, now I understand what that ordered exponential is.
The first time I saw it was in QM2:
https://en.wikipedia.org/wiki/Dyson_series
In practice, though, one can only evaluate the ordered exponential numerically.
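Here is that power-series route sketched for the concrete system in the OP. Substituting ##x_1 = \sum_n a_n t^n## and ##x_2 = \sum_n b_n t^n## into the system and matching coefficients of ##t^n## gives ##(n+1)a_{n+1} = 2a_{n-1} + b_{n-2}## and ##(n+1)b_{n+1} = a_{n-3} + 4b_{n-1}##, with negative-index terms set to zero. A minimal implementation, assuming NumPy:

```python
import numpy as np

def series_coeffs(a0, b0, N=20):
    """Coefficients of x1 = sum a_n t^n, x2 = sum b_n t^n for the OP's system,
    from the recurrences (n+1) a_{n+1} = 2 a_{n-1} + b_{n-2}
    and                  (n+1) b_{n+1} = a_{n-3} + 4 b_{n-1}."""
    a, b = np.zeros(N), np.zeros(N)
    a[0], b[0] = a0, b0
    for n in range(N - 1):
        a[n + 1] = ((2 * a[n - 1] if n >= 1 else 0.0)
                    + (b[n - 2] if n >= 2 else 0.0)) / (n + 1)
        b[n + 1] = ((a[n - 3] if n >= 3 else 0.0)
                    + (4 * b[n - 1] if n >= 1 else 0.0)) / (n + 1)
    return a, b
```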
 
