Linear Independence of Eigenvectors and Jordan Blocks

In summary: eigenvectors corresponding to distinct eigenvalues are linearly independent, but it is not true that all eigenvectors of a matrix are independent (any scalar multiple of an eigenvector is again an eigenvector). When a matrix does not have a full basis of eigenvectors it is not diagonalizable, and its structure is described instead by the Jordan normal form; the relevant generalized eigenvectors are the vectors annihilated by powers of (A - aI) for the eigenvalues a. Powers of a Jordan block follow a simple pattern that is easy to spot by computing a few small examples.
  • #1
crazygrey
Hi everyone,
I had a couple of questions:
1) If I want to prove that eigenvectors are linearly independent by induction, how do I do so? I understand that I can start with dimension 1 and take v1 to be a non-zero vector, hence linearly independent on its own, but what do I do after that for the other cases?

2) What is the power of a Jordan block, for example (J)^k in the general case?

Thanks
 
  • #2
1) Since eigenvectors have no reason to be linearly independent in general, you cannot prove this as stated. Perhaps you should check the wording of the question? There are some extra hypotheses you've omitted. So what are they, and how can you use them?

2) What have you done? Any examples? Have you tried computing some powers of Jordan blocks? If not, why not? If you just do it (try putting ones on the diagonal), you should be able to spot some patterns and then formulate an easy-to-prove conjecture.
 
  • #3
1) Basically the idea is that distinct eigenvalues give independent eigenvectors, and I just want to prove this theorem. Let [v1, v2, v3, ..., vn] be a set of eigenvectors with distinct eigenvalues w1, w2, w3, ..., wn. By induction:
if n=1, v1 has to be linearly independent since it is a non-zero vector.
if n=2, suppose a1v1 + a2v2 = 0 ----(1)
Multiply (1) by A, knowing that Av = wv:
a1w1v1 + a2w2v2 = 0 ----(2)
Multiply (1) by w1:
a1w1v1 + a2w1v2 = 0 ----(3)
(2)-(3): a2(w2-w1)v2 = 0, thus a2 = 0, and then (1) gives a1 = 0.

I want to prove that this holds for all n eigenvectors; how do I do that?

2) I know that the Jordan form is a representation of a square matrix A with repeated eigenvalues. I understand the pattern of the Jordan block, where the repeated eigenvalue sits on the diagonal and there are 1's on the superdiagonal. I know that if I square the Jordan block the 1's start shifting location, but I'm unable to obtain a general formula for (J)^k, knowing that A = Q*J*(Q)^-1 where the columns of Q are the eigenvectors. I hope I clarified what I meant.

Appreciate your help
 
  • #4
1) You cannot prove it holds for all eigenvectors, because it is trivially false (if v is an eigenvector, so is 2v).

2) Sorry, but this is quite an easy question once you guess the formula and I strongly urge you to try a couple of examples, like working out the first few powers of

[1 1 0]
[0 1 1]
[0 0 1]
because the numbers you will see are very well known and will show you what to do. You can even do that in your head.

In fact doing

[k 1 0]
[0 k 1]
[0 0 k]
is probably just as easy and instructive. If you just do this you will learn a lot more than being told the answer.
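To support this suggestion, here is a quick numeric check (a sketch using numpy, which the thread itself does not use); it just prints the first few powers of the 3x3 example above so the pattern is visible:

```python
import numpy as np

# The 3x3 example from the post: ones on the diagonal and the superdiagonal.
J = np.array([[1, 1, 0],
              [0, 1, 1],
              [0, 0, 1]])

# Print the first few powers and watch how the entries above the diagonal grow.
for k in range(1, 5):
    print(f"J^{k} =")
    print(np.linalg.matrix_power(J, k))
```

The numbers that appear above the diagonal are the very well known ones the post alludes to, and the same experiment for the matrix with k on the diagonal is just as quick by hand.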
 
  • #5
1) I wanted to prove that all eigenvectors in the vector space are linearly independent, not just an eigenvector and a scalar multiple of it. If I have a set of vectors [v1, v2, v3, ..., vn] (1, 2, 3, ..., n are indices) belonging to a vector space, I want to prove that all of them are linearly independent...

2) Thanks I will work it through ...
 
  • #6
crazygrey said:
1) I wanted to prove that all eigenvectors in the vector space are linearly independent, not just an eigenvector and a scalar multiple of it. If I have a set of vectors [v1, v2, v3, ..., vn] (1, 2, 3, ..., n are indices) belonging to a vector space, I want to prove that all of them are linearly independent...
You still haven't understood. "All eigenvectors" includes both an eigenvector v and any multiple of it, so that's not what you want to say. You can't prove that "all eigenvectors are linearly independent"; it's not true. You can show that eigenvectors corresponding to distinct eigenvalues are independent. But even then it doesn't follow that you will have a "complete set of eigenvectors", that is, a basis consisting of eigenvectors. If such a basis exists, then writing the linear operator as a matrix in that basis gives a diagonal matrix, and not all matrices are diagonalizable; that's why you need the Jordan normal form.
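For reference, the inductive step that finishes the calculation started in post #3 can be sketched as follows (this is the standard argument, written out here rather than quoted from the thread): assume v_1, ..., v_k with distinct eigenvalues w_1, ..., w_k are independent, and suppose a_1 v_1 + ... + a_{k+1} v_{k+1} = 0; then

```latex
0 = A\Big(\sum_{i=1}^{k+1} a_i v_i\Big) - w_{k+1}\sum_{i=1}^{k+1} a_i v_i
  = \sum_{i=1}^{k} a_i (w_i - w_{k+1})\, v_i ,
```

so by the induction hypothesis each a_i(w_i - w_{k+1}) = 0; since the eigenvalues are distinct this gives a_1 = ... = a_k = 0, and then a_{k+1} v_{k+1} = 0 forces a_{k+1} = 0 because v_{k+1} is non-zero.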
 
  • #7
Define a generalized eigenvector of a matrix A, for the eigenvalue a, to be a vector v such that some power of (A - aI) annihilates v, i.e. (A - aI)^r v = 0 for some r > 0.

Then if the characteristic polynomial of A is (X - a1)^r1 (X - a2)^r2 ... (X - as)^rs, A has r1 linearly independent generalized eigenvectors for the value a1, r2 for the value a2, etc...

If there are exactly t1 linearly independent actual eigenvectors for a1, t2 for a2, etc.,

then in the Jordan form for A there are exactly t1 Jordan blocks for a1, t2 Jordan blocks for a2, etc. The sizes of these blocks are determined, in a slightly complicated way, by the dimensions of the kernels of the operators (A - a)^i for all i. When ri = ti for all i, the matrix A is diagonalizable, i.e. every generalized eigenvector is an actual eigenvector, i.e. every vector annihilated by a power of (A - a) is annihilated by (A - a) itself.
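To make the block counting concrete, here is a small check with sympy (the 4x4 matrix is my own illustrative example, not one from the thread): the number of Jordan blocks for an eigenvalue a equals the number of independent actual eigenvectors, i.e. dim ker(A - aI), and the block sizes can be read off from how dim ker(A - aI)^i grows with i.

```python
import sympy as sp

# Illustrative 4x4 matrix whose only eigenvalue is 2; it is not diagonalizable.
A = sp.Matrix([[2, 1, 0, 0],
               [0, 2, 0, 0],
               [0, 0, 2, 1],
               [0, 0, 0, 2]])

a = 2
Id = sp.eye(4)

# Dimensions of the kernels of (A - a*Id)^i for i = 1, 2, 3.
for i in range(1, 4):
    dim = len(((A - a * Id) ** i).nullspace())
    print(f"dim ker (A - {a}I)^{i} = {dim}")

# Two independent eigenvectors, so the Jordan form has two blocks for a = 2.
P, J = A.jordan_form()
print(J)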
 
  • #8
You might want to read my 14-page treatment of all of linear algebra from scratch, through Jordan and rational forms, on my webpage, with proofs.

The independence of generalized eigenspaces with distinct eigenvalues depends on the Euclidean algorithm as follows. Let V(i) be the subspace of vectors annihilated by (t - ci)^ri, where (t - ci)^ri is a factor in the minimal polynomial for the operator T.

Then map the product of the V(i) into the original space V by addition, i.e. an m-tuple (v1, ..., vm) goes to the sum v1 + ... + vm in V.

The independence of these different V(i) is the injectivity of this map.

So consider the product Qi of all the factors (t - cj)^rj for j ≠ i. These Qi are relatively prime, so by Euclid's algorithm there exist polynomials Pi such that ∑ PiQi = 1.

Suppose v1 + ... + vm = 0, with each vj in V(j); then each vi is a combination of the others. Now Qi annihilates every vj with j ≠ i (it contains the factor (t - cj)^rj), and every other Qj annihilates vi (it contains the factor (t - ci)^ri). So apply ∑ PjQj to vi: every term with j ≠ i annihilates vi, and the remaining term PiQi annihilates all the other vj; but vi can be written in terms of those other vj, so PiQi annihilates vi as well.

But by the equation ∑ PjQj = 1, applying this sum to vi just gives vi back. Hence every vi = 0.

This does it.

A similar argument shows surjectivity, so V actually decomposes as a direct product of those generalized eigenspaces. This argument can be made to look more elementary, but no more understandable, and is probably given in your book as an inductive argument.
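A tiny worked instance of the Bezout identity used above (my own example, not from the post): take the minimal polynomial (t - 1)^2 (t - 2), so Q_1 = (t - 2) and Q_2 = (t - 1)^2. Then P_1 = -t and P_2 = 1 work, since

```latex
P_1 Q_1 + P_2 Q_2 = -t(t-2) + (t-1)^2 = (-t^2 + 2t) + (t^2 - 2t + 1) = 1 .
```

Applying this identity of operators to a relation v_1 + v_2 = 0, with v_1 annihilated by (T - 1)^2 and v_2 by (T - 2), kills every term except the vector being solved for, which is exactly how the argument above forces each v_i to be 0.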
 
  • #9
Really appreciate your help, that was very helpful. Thanks
 
  • #10
wow, how kind of you to say so.
 

What is the concept of linear independence in the context of eigenvectors and Jordan blocks?

Linear independence refers to the property of a set of vectors in which no vector in the set can be expressed as a linear combination of the others. In the context of eigenvectors and Jordan blocks, this means that no eigenvector in the set can be written as a linear combination of the other eigenvectors in the set.
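In symbols (the standard definition, added here for reference): vectors v_1, ..., v_n are linearly independent when

```latex
a_1 v_1 + a_2 v_2 + \dots + a_n v_n = 0 \quad\Longrightarrow\quad a_1 = a_2 = \dots = a_n = 0 .
```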

How do I determine if a set of eigenvectors is linearly independent?

A set of n eigenvectors in an n-dimensional space is linearly independent if and only if the determinant of the matrix whose columns are these eigenvectors is non-zero; more generally, a set of vectors is independent exactly when that matrix has full column rank. This can be checked by Gaussian elimination or, in the square case, by computing the determinant directly.
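For example, here is how such a check might look in practice (a sketch with numpy; the vectors are made up for illustration):

```python
import numpy as np

# Stack the candidate eigenvectors as the columns of a matrix.
V = np.column_stack([[1, 0, 1],
                     [0, 1, 1],
                     [1, 1, 2]])  # third column = first column + second column

# Full column rank <=> the columns are linearly independent.
rank = np.linalg.matrix_rank(V)
print("rank:", rank, "independent:", rank == V.shape[1])

# For a square matrix of vectors, independence <=> non-zero determinant.
print("det:", np.linalg.det(V))
```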

What is a Jordan block and how does it relate to linear independence of eigenvectors?

A Jordan block is a square matrix with a single eigenvalue on its diagonal, 1's on the superdiagonal, and zeros elsewhere; blocks of size greater than 1 appear when an eigenvalue does not have enough independent eigenvectors. The connection to linear independence is this: a matrix is diagonalizable, i.e. has a basis of linearly independent eigenvectors, exactly when every Jordan block in its Jordan form is 1x1.
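For instance, a single 3x3 Jordan block with eigenvalue λ looks like

```latex
J_3(\lambda) =
\begin{pmatrix}
\lambda & 1 & 0 \\
0 & \lambda & 1 \\
0 & 0 & \lambda
\end{pmatrix},
```

and it contributes only one eigenvector for a 3-dimensional generalized eigenspace, which is exactly why a matrix containing such a block cannot be diagonalized.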

Why is linear independence of eigenvectors important in the study of linear algebra?

Linear independence of eigenvectors is important because a matrix with a full set of linearly independent eigenvectors can be diagonalized, which greatly simplifies calculations with the matrix, such as computing its powers. Linear independence of eigenvectors is thus closely tied to diagonalization, which is a useful tool in many applications of linear algebra.

Are there any real-world applications of linear independence of eigenvectors and Jordan blocks?

Yes, there are many real-world applications of linear independence of eigenvectors and Jordan blocks in fields such as physics, engineering, and computer science. For example, in quantum mechanics, eigenvectors and eigenvalues are used to represent the possible states and energies of a system, and linear independence is crucial in determining these properties. In computer graphics and image processing, eigenvectors are used in algorithms for image compression and feature extraction.
