Generalized Eigenvector Solutions for a System of Differential Equations

STEMucator
Homework Helper

Homework Statement



Solve the system:

##x' = 5x - y##
##y' = 4x + y##

Homework Equations



The superscript ##t## denotes the transpose.

The Attempt at a Solution



I'm a bit rusty with these and I had a small question.

I put the system into the form ##x' = Ax## and proceeded to solve for the eigenvalues. I found that ##\lambda_1 = 3## is the only eigenvalue, with algebraic multiplicity 2.

I then solved for the eigenspace ##E_A(\lambda_1) = \text{null}(A - \lambda_1 I)## and found it is spanned by the single eigenvector ##v_1 = [1, 2]^t##.

This yielded my first independent solution ##x_1(t) = e^{3t} [1, 2]^t##.
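As a quick numerical sanity check of the eigenvalue and eigenvector (not part of the original working; a sketch using numpy):

```python
import numpy as np

# Coefficient matrix of the system x' = Ax
A = np.array([[5.0, -1.0],
              [4.0, 1.0]])

# Characteristic polynomial: (5 - l)(1 - l) + 4 = (l - 3)^2,
# so 3 is the only eigenvalue, with multiplicity 2.
eigvals = np.linalg.eigvals(A)
print(eigvals)  # both eigenvalues are 3 (up to floating-point error)

# v1 = [1, 2]^t spans the eigenspace: (A - 3I) v1 = 0
v1 = np.array([1.0, 2.0])
print((A - 3 * np.eye(2)) @ v1)  # [0. 0.]
```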

Now I need a second independent solution of the form ##x_2(t) = e^{3t}(tv_1 + u_1)##, where ##u_1## is a solution of the system ##(A - \lambda_1 I)u_1 = v_1##.

The system is easy to solve; two row operations leave me with:

[2 -1 | 1]
[0 0 | 0]

Hence the solutions are ##u_1 = [1/2, 0]^t + c\,[1/2, 1]^t## for any scalar ##c## (an affine line, not a span), where I've noticed the homogeneous part ##[1/2, 1]^t## is simply ##\frac{1}{2}v_1##.

Now my problem: the book lists ##u_1 = [0, -1]^t## as the generalized eigenvector used to obtain ##x_2(t)##. I understand how they obtained this, but I'm wondering if my solution is also correct. I would wind up using ##[1/2, 0]^t## as my generalized eigenvector, or any multiple of it; preferably ##[1, 0]^t##.

Then of course ##X(t) = (x_1(t), x_2(t))## and ##X(t)C## is the general solution.
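A numerical check of the two candidate generalized eigenvectors (not part of the original working; a sketch using numpy) confirms that both ##[1/2, 0]^t## and the book's ##[0, -1]^t## solve ##(A - 3I)u = v_1##, and that either choice yields a valid solution ##x_2(t)##:

```python
import numpy as np

A = np.array([[5.0, -1.0], [4.0, 1.0]])
v1 = np.array([1.0, 2.0])
N = A - 3 * np.eye(2)  # A - lambda*I

# Both candidates solve (A - 3I) u = v1:
u_mine = np.array([0.5, 0.0])   # from the row reduction above
u_book = np.array([0.0, -1.0])  # the book's choice
print(N @ u_mine)  # [1. 2.] = v1
print(N @ u_book)  # [1. 2.] = v1
# They differ by [1/2, 1]^t = v1/2, an element of the eigenspace.

# Sanity check: x2(t) = e^{3t}(t v1 + u) satisfies x' = Ax for either u.
def x2(t, u):
    return np.exp(3 * t) * (t * v1 + u)

def x2_prime(t, u):
    # product rule: d/dt [e^{3t}(t v1 + u)] = e^{3t}(3(t v1 + u) + v1)
    return np.exp(3 * t) * (3 * (t * v1 + u) + v1)

for u in (u_mine, u_book):
    t = 0.7
    print(np.allclose(x2_prime(t, u), A @ x2(t, u)))  # True
```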
 
Zondrina said:

If you write the system as ##v' = Av##, where
$$v = \pmatrix{x\\y}, \:\text{ and }\: A = \pmatrix{5&-1\\4&1}$$
then the solution is
$$v = e^{At} v_0$$

I don't know if you have yet studied the Jordan Canonical Form, but in this case it is
$$J = \pmatrix{3&1\\0&3}$$
so there is an invertible matrix ##P## such that ##A = P J P^{-1}##, hence
$$e^{At} \equiv I + \sum_{n=1}^{\infty} \frac{1}{n!} (At)^n = I + \sum_{n=1}^{\infty} \frac{t^n}{n!} A^n = I + \sum_{n=1}^{\infty} \frac{t^n}{n!} P J^n P^{-1} = P e^{Jt} P^{-1}$$
The matrix ##e^{Jt}## is easy to get:
$$e^{tJ} = \pmatrix{e^{3t} & t e^{3t}\\0 & e^{3t}}$$
Therefore, the general homogeneous solution is of the form
$$x = a\, e^{3t} + b\, t e^{3t}, \qquad y = c\, e^{3t} + h\, t e^{3t}$$
for some constants ##a, b, c, h##.

Note: the Jordan form just comes from the generalized eigenvalue problem: if ##u_1## is a generalized eigenvector---so that for eigenvalue ##r## we have ##(A - rI)^2 u_1 = 0##---then setting ##(A - rI)u_1 = u_2## we see that ##u_2## is an eigenvector and that ##Au_1 = r u_1 + u_2##. Together with ##A u_2 = r u_2##, we see that the matrix of ##A## expressed in the basis ##\{u_2, u_1\}## is the Jordan form ##J##.
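The Jordan decomposition and the identity ##e^{At} = P e^{Jt} P^{-1}## can be checked symbolically (a sketch using sympy, not part of the original reply):

```python
import sympy as sp

t = sp.symbols('t')
A = sp.Matrix([[5, -1], [4, 1]])

# Jordan decomposition: A = P J P^{-1}
P, J = A.jordan_form()
print(J)  # Matrix([[3, 1], [0, 3]])

# e^{At} computed directly agrees with P e^{Jt} P^{-1}
expAt = (A * t).exp()
via_jordan = sp.simplify(P * (J * t).exp() * P.inv())
print(sp.simplify(expAt - via_jordan))  # zero matrix
```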
 
Ray Vickson said:
I haven't seen this, but some wiki research has shown me how it works. It's quite an interesting formulation, actually.

I'm going to go play around with this for a while now. Thank you.
 