Generalized Eigenvector Solutions for a System of Differential Equations

SUMMARY

The discussion centers on solving the system of differential equations ##x' = Ax##, where ##A = \begin{pmatrix} 5 & -1 \\ 4 & 1 \end{pmatrix}##. The only eigenvalue, ##\lambda_1 = 3## with multiplicity 2, yields the eigenvector ##v_1 = \begin{pmatrix} 1 \\ 2 \end{pmatrix}##. The original poster asks about the generalized eigenvector: the book lists ##u_1 = \begin{pmatrix} 0 \\ -1 \end{pmatrix}##, while they propose ##u_1 = \begin{pmatrix} 1/2 \\ 0 \end{pmatrix}##. The general solution is expressed as ##X(t) = (x_1(t), x_2(t))##, and a reply shows how the same result follows from the Jordan Canonical Form and the matrix exponential ##e^{At}##.

PREREQUISITES
  • Understanding of eigenvalues and eigenvectors in linear algebra
  • Familiarity with systems of differential equations
  • Knowledge of the Jordan Canonical Form
  • Proficiency in matrix exponentiation and its applications
NEXT STEPS
  • Study the derivation and application of the Jordan Canonical Form in differential equations
  • Learn about matrix exponentiation techniques for solving linear systems
  • Explore generalized eigenvectors and their role in finding solutions to differential equations
  • Investigate the implications of eigenvalue multiplicity on the solution structure
USEFUL FOR

Mathematicians, engineering students, and anyone involved in solving systems of differential equations, particularly those interested in advanced linear algebra concepts.

STEMucator
Homework Helper

Homework Statement



Solve the system:

##x' = 5x - y##
##y' = 4x + y##

Homework Equations



The superscript ##t## denotes transpose.

The Attempt at a Solution



I'm a bit rusty with these and I had a small question.

I put the system into the form ##x' = Ax## and proceeded to solve for the eigenvalues. I found that ##\lambda_1 = 3## was the only eigenvalue of multiplicity 2.

I then solved for the eigenspace ##ε_A(\lambda_1) = null(A - \lambda_1I)## and found that the only eigenvector in the span was ##v_1 = [1, 2]^t##.

This yielded my first independent solution ##x_1(t) = e^{3t} [1, 2]^t##.
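
As a quick numerical sanity check (a minimal sketch in plain Python; the matrix and vector are exactly those from the post, the helper function is mine):

```python
# Matrix of the system x' = Ax, from the post.
A = [[5, -1], [4, 1]]

def matvec(M, v):
    """2x2 matrix-vector product."""
    return [M[0][0]*v[0] + M[0][1]*v[1],
            M[1][0]*v[0] + M[1][1]*v[1]]

# Characteristic polynomial: lambda^2 - tr(A)*lambda + det(A).
tr = A[0][0] + A[1][1]                    # 6
det = A[0][0]*A[1][1] - A[0][1]*A[1][0]   # 9
# lambda^2 - 6*lambda + 9 = (lambda - 3)^2, so 3 is a double root.
assert tr == 6 and det == 9

# v1 = [1, 2]^t is an eigenvector for lambda = 3: A v1 = 3 v1.
v1 = [1, 2]
assert matvec(A, v1) == [3 * c for c in v1]
```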

I need a second independent solution now of the form ##x_2(t) = e^{3t}(tv_1 + u_1)##, where ##u_1## is a solution of the system ##(A - \lambda_1 I)u_1 = v_1##.

Super easy system to solve as it required only two row operations and it left me with:

##\left[ \begin{array}{cc|c} 2 & -1 & 1 \\ 0 & 0 & 0 \end{array} \right]##

Hence the solutions are ##u_1 = [1/2, 0]^t + s\,[1/2, 1]^t## for any scalar ##s## (an affine set rather than a span), where I've noticed the vector ##[1/2, 1]^t## spanning the homogeneous part is simply ##\frac{1}{2}v_1##.

Now my problem. The book lists ##u_1 = [0, -1]^t## as the generalized eigenvector they have used to obtain ##x_2(t)##. I understand how they have obtained this solution, but I'm wondering if my solution is also correct. I would wind up using ##[1/2, 0]^t## as my generalized eigenvector or any multiple of it. Preferably I would use ##[1, 0]^t##.
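
Both candidates can be checked directly: the book's ##[0, -1]^t## and the ##[1/2, 0]^t## above both satisfy ##(A - 3I)u = v_1##, and they differ by ##\frac{1}{2}v_1##, an element of the eigenspace. A minimal sketch in plain Python (the helper function is mine):

```python
# B = A - 3I for A = [[5, -1], [4, 1]].
B = [[2, -1], [4, -2]]
v1 = [1.0, 2.0]

def matvec(M, v):
    """2x2 matrix-vector product."""
    return [M[0][0]*v[0] + M[0][1]*v[1],
            M[1][0]*v[0] + M[1][1]*v[1]]

u_book = [0.0, -1.0]   # the book's generalized eigenvector
u_mine = [0.5, 0.0]    # the choice proposed in the post

# Both satisfy (A - 3I) u = v1, so both generate a second solution.
assert matvec(B, u_book) == v1
assert matvec(B, u_mine) == v1

# Their difference [1/2, 1]^t = (1/2) v1 lies in the eigenspace.
assert [u_mine[i] - u_book[i] for i in range(2)] == [0.5 * c for c in v1]
```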

Then of course ##X(t) = (x_1(t), x_2(t))## is a fundamental matrix, and ##X(t)C## is the general solution for an arbitrary constant vector ##C##.
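
One can also verify symbolically that ##x_2(t)## solves the system (a sketch using SymPy, with ##u_1 = [1/2, 0]^t## as the choice from the post):

```python
import sympy as sp

t = sp.symbols('t')
A = sp.Matrix([[5, -1], [4, 1]])
v1 = sp.Matrix([1, 2])
u1 = sp.Matrix([sp.Rational(1, 2), 0])   # [0, -1]^t works just as well

# Second solution x2(t) = e^{3t} (t v1 + u1); check that x2' - A x2 = 0.
x2 = sp.exp(3*t) * (t*v1 + u1)
residual = sp.simplify(x2.diff(t) - A*x2)
assert residual == sp.zeros(2, 1)
```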
 
Zondrina said:

Now my problem. The book lists ##u_1 = [0, -1]^t## as the generalized eigenvector they have used to obtain ##x_2(t)##. I understand how they have obtained this solution, but I'm wondering if my solution is also correct.

If you write the system as ##v' = Av##, where
[tex]v = \pmatrix{x\\y}, \:\text{ and }\: A = \pmatrix{5&-1\\4&1}[/tex]
then the solution is
[tex]v = e^{At} v_0[/tex]

I don't know if you have yet studied the Jordan Canonical Form, but in this case it is
[tex]J = \pmatrix{3&1\\0&3}[/tex]
so there is an invertible matrix ##P## such that ##A = P J P^{-1}##, hence
[tex]e^{At} \equiv I + \sum_{n=1}^{\infty} \frac{1}{n!} (At)^n = I + \sum_{n=1}^{\infty} \frac{t^n}{n!} A^n = I + \sum_{n=1}^{\infty} \frac{t^n}{n!} P J^n P^{-1} = P e^{Jt} P^{-1}[/tex]
The matrix ##e^{Jt}## is easy to get:
[tex]e^{Jt} = \pmatrix{e^{3t} & t e^{3t}\\0 & e^{3t}}[/tex]
Therefore, the general homogeneous solution is of the form
[tex]x = a\, e^{3t} + b\, t e^{3t} \\ y = c\, e^{3t} + h\, t e^{3t}[/tex]
for some constants ##a,b,c,h##.

Note: the Jordan form just comes from the generalized eigenvalue problem: if ##u_1## is a generalized eigenvector---so that for eigenvalue ##r## we have ##(A - rI)^2 u_1 = 0##---then setting ##(A - rI)u_1 = u_2## we see that ##u_2## is an eigenvector and that ##Au_1 = r u_1 + u_2##. Together with ##A u_2 = r u_2##, we see that the matrix ##A## expressed in the ordered basis ##\{u_2, u_1\}## (eigenvector first) is the Jordan form ##J##.
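
The decomposition above can be reproduced with a computer algebra system (a sketch using SymPy, whose `jordan_form` returns ##P, J## with ##A = P J P^{-1}##):

```python
import sympy as sp

t = sp.symbols('t')
A = sp.Matrix([[5, -1], [4, 1]])

# Jordan decomposition A = P J P^{-1}.
P, J = A.jordan_form()
assert J == sp.Matrix([[3, 1], [0, 3]])
assert sp.simplify(P*J*P.inv() - A) == sp.zeros(2, 2)

# e^{Jt} as displayed above, and the identity e^{At} = P e^{Jt} P^{-1}.
expJt = sp.Matrix([[sp.exp(3*t), t*sp.exp(3*t)],
                   [0,           sp.exp(3*t)]])
expAt = sp.simplify((A*t).exp())
assert sp.simplify(expAt - P*expJt*P.inv()) == sp.zeros(2, 2)
```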
 
Ray Vickson said:

I don't know if you have yet studied the Jordan Canonical Form, but in this case it is
[tex]J = \pmatrix{3&1\\0&3}[/tex]
so there is an invertible matrix ##P## such that ##A = P J P^{-1}##.

I haven't seen this, but some wiki research has informed me of how this works. It's quite an interesting formulation actually.

I'm going to go play around with this for awhile now, thank you.
 
