Finding eigenvalues and eigenvectors for a polynomial transformation

Summary:
To find the eigenvalues and eigenvectors for the polynomial transformation T defined by T(p)(x) = p'(x) + 2p(x), one must solve the equation p'(x) + 2p(x) = λp(x). This involves treating the problem as a first-order linear differential equation with the parameter λ. A suitable basis for the polynomial space P3(R) should be established, typically using standard basis vectors, to express any cubic polynomial. The transformation T can then be applied to these basis vectors to construct a matrix representation, allowing for the identification of eigenvalues and eigenspaces. Understanding the relationship between the chosen basis and the resulting eigenvectors is crucial, as it ensures consistency across different bases.
trap101
Hi,

So for some reason I have the hardest time trying to work with polynomials in linear algebra. I can't explain it, but whenever I see a question I draw a complete blank.

Question: i) Find all the eigenvalues. ii) for each eigenvalue λ, find a basis of the eigenspace Eλ.

T: P3(R) --> P3(R) defined by T(p)(x) = p'(x) + 2p(x)

So this is all I'm given. My question is what polynomials do I use to find the eigenvalues, and once I find those eigenvalues, how do I find the eigenvectors? I'm inclined to try to solve it the way I would an eigenvector problem with matrices; the problem is I don't know how to put this into a matrix.
 
trap101 said:
My question is what polynomials do I use to find the eigenvalues, and once I find those eigenvalues, how do I find the eigenvectors? ... I don't know how to put this into a matrix.

You need to solve the equation p'(x) + 2p(x) = λp(x), and the solution must be a polynomial.

RGV
 
Ray Vickson said:
You need to solve the equation p'(x) + 2p(x) = λp(x), and the solution must be a polynomial.

RGV

But what am I solving for? It can't be for λ. In simple algebra I would have the polynomial and solve for x. Do I use the standard basis vectors?
 
trap101 said:
But what am I solving for? It can't be for λ. In simple algebra I would have the polynomial and solve for x. Do I use the standard basis vectors?

In this case you are solving a first-order linear differential equation that happens to have a parameter, λ, in it. In other words, you need to find the function p(x).

RGV
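
To see why that hint pins things down without any differential-equations machinery (a sketch using only the definition of T given above): the condition p'(x) + 2p(x) = λp(x) rearranges to p'(x) = (λ − 2)p(x). If λ ≠ 2, no nonzero polynomial can satisfy this: a nonzero constant p gives 0 on the left but something nonzero on the right, and a polynomial of degree n ≥ 1 gives degree n − 1 on the left but degree n on the right. If λ = 2, the condition is just p'(x) = 0, so p must be a constant. The matrix construction discussed below should lead to the same conclusion.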
 
Oh, I should mention that I haven't done differential equations yet at my university. I've only taken calculus courses and Linear Algebra I.

What I wanted to do was solve it the way I would an eigenvector problem with matrices. But an issue I always have is that I don't know how to translate the polynomial setting into a matrix, especially when I'm not given a specific polynomial.
 
trap101 said:
problem is I don't know how to put this into a matrix.

What is the basis for your space? HINT: you need to find 4 polynomials such that any polynomial of degree at most 3 can be expressed as a linear combination of them. There are many possible choices, but probably only a few natural ones.

Okay, assuming you can answer that question, here is how you would find a matrix representation for the operator. Say the basis is e_i for i = 1, ..., 4.

Define the vectors v_j as

v_j = T(e_j)

The line above means you take the polynomials you chose as your basis and apply the operator T to each of them, which amounts to adding twice the polynomial to its own derivative.

Now since each v_j is again a polynomial in P3(R), you can express it uniquely in the basis you chose. Say this representation is
v_j = b_{1j}e_1 + b_{2j}e_2 + b_{3j}e_3 + b_{4j}e_4
Your matrix is then given by T_{ij} = b_{ij}.
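
If you want to check the arithmetic of that construction, here is a minimal sketch in Python with SymPy, assuming the standard basis {1, x, x^2, x^3} (one natural choice, not the only one):

Code:
import sympy as sp

x = sp.symbols('x')

# Standard basis for P3(R) -- an assumed (but natural) choice; any basis works.
basis = [sp.Integer(1), x, x**2, x**3]

def T(p):
    # The operator from the thread: T(p)(x) = p'(x) + 2*p(x)
    return sp.diff(p, x) + 2*p

# Column j of the matrix holds the coordinates of T(e_j) in the chosen basis.
columns = []
for e in basis:
    coeffs = sp.Poly(T(e), x).all_coeffs()[::-1]   # constant term first
    coeffs += [0] * (len(basis) - len(coeffs))     # pad up to degree 3
    columns.append(coeffs)

M = sp.Matrix(columns).T    # transpose so each image T(e_j) becomes column j
print(M)                    # upper triangular with 2 on the whole diagonal
print(M.eigenvects())       # eigenvalue 2 only; its eigenvector is the constant polynomial

The resulting matrix is upper triangular with 2 in every diagonal entry, so with that basis choice the characteristic polynomial is (λ − 2)^4 and the only independent eigenvector is the constant polynomial, consistent with the degree argument sketched earlier.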
 
Ahhh, thank you. To build on that, because it always seems to pop up in some fashion or another: if I'm not given any specific vectors to apply my linear transformation to, should I just assume I can use the standard basis vectors? My trouble always starts when I don't know which vectors to choose.

The reason I ask is that there is a similar question where I have to verify that a given vector is an eigenvector:

p = x^3, under the same P3(R) setup, but with T defined by T(p)(x) = xp'(x) - 4p(x). Now the eigenvector condition is T(p) = λp. So do I use the standard basis vectors of P3(R) in that transformation and try to obtain the λ that would prove this?
 
It shouldn't matter what intermediate basis you work in. When you re-express the eigenvectors as polynomials, they should be the same no matter what basis you choose. If an eigenspace for a particular eigenvalue has dimension greater than 1, then the individual polynomials you get might be different, but collectively they should span the same space.
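
For the specific check in the previous post, no matrix is even needed (assuming the operator there really is T(p)(x) = xp'(x) - 4p(x), as written): applying it directly to p(x) = x^3 gives x·(3x^2) - 4x^3 = 3x^3 - 4x^3 = -x^3 = (-1)·p(x), so p is an eigenvector with λ = -1, and any basis you later use for a matrix representation has to reproduce that.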
 
Thanks. I'm going to give this all a try; hopefully there won't be any problems.
 
