Linear Independence of Eigenvectors and Jordan Blocks


by crazygrey
Tags: blocks, eigenvectors, independence, jordan, linear
crazygrey
#1
Oct1-06, 03:07 PM
P: 7
Hi everyone,
I had a couple of questions:
1) If I want to prove by induction that eigenvectors are linearly independent, how do I do so? I understand that I can start with dimension 1 and take v1 to be a nonzero vector, hence linearly independent, but what do I do after that for the other cases?

2) What is the power of a Jordan block, for example (J)^k in the general case?

Thanks
matt grime
#2
Oct2-06, 01:42 AM
Sci Advisor
HW Helper
P: 9,398
1) Since eigenvectors have no reason to be linearly independent in general, you cannot do this. Perhaps you should check the wording of the question? There are some extra hypotheses you've omitted. So what are they, and how can you use them?


2) What have you done? Any examples? Have you tried computing some powers of Jordan blocks? If not, why not? If you just do it (try putting ones on the diagonal), you should be able to spot some patterns and then formulate an easy-to-prove conjecture.
crazygrey
#3
Oct2-06, 12:31 PM
P: 7
1) Basically the idea is that distinct eigenvalues give rise to independent eigenvectors, and I just want to prove this theorem. Let v = [v1 v2 v3 ... vn] be a set of eigenvectors with distinct eigenvalues w1, w2, w3, ..., wn. By induction:
if n = 1, v1 has to be linearly independent since it is a nonzero vector.
if n = 2, a1v1 + a2v2 = 0 ----(1)
Multiply (1) by A, knowing that Av = wv:
a1w1v1 + a2w2v2 = 0 ----(2)
Multiply (1) by w1:
a1w1v1 + a2w1v2 = 0 ----(3)
(2) - (3): a2(w2 - w1)v2 = 0, thus a2 = 0 (and then (1) gives a1 = 0).

I want to prove that this holds for all eigenvectors, how do I do that?

2) I know that a Jordan block comes from representing a square matrix A with repeated eigenvalues. I understand the pattern of the Jordan block, where the repeated eigenvalue sits on the diagonal with 1's on the superdiagonal. I know that if I square the Jordan block the 1's start shifting position, but I'm unable to obtain a general formula for (J)^k, knowing that A = Q*J*Q^-1 where the columns of Q are the eigenvectors. I hope I clarified what I meant.

Appreciate your help

matt grime
#4
Oct2-06, 01:11 PM
Sci Advisor
HW Helper
P: 9,398



1) You cannot prove it holds for all eigenvectors, because it is trivially false (if v is an eigenvector, so is 2v).

2) Sorry, but this is quite an easy question once you guess the formula and I strongly urge you to try a couple of examples, like working out the first few powers of

[1 1 0]
[0 1 1]
[0 0 1]
because the numbers you will see are very, very well known and will show you what to do. You can even do that in your head.

In fact doing

[k 1 0]
[0 k 1]
[0 0 k]
is probably just as easy and instructive. If you just do this you will learn a lot more than being told the answer.
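
If you want to automate that experiment, here is a minimal sketch in Python (assuming sympy is installed). It only prints the first few powers of the 3x3 block with a symbolic eigenvalue k on the diagonal; spotting the pattern is still left to you.

```python
# Minimal sketch: print the first few powers of a 3x3 Jordan block with a
# symbolic eigenvalue k on the diagonal, as suggested above.
from sympy import Matrix, symbols, pprint

k = symbols('k')
J = Matrix([[k, 1, 0],
            [0, k, 1],
            [0, 0, k]])   # Jordan block: eigenvalue on the diagonal, 1's above

for n in range(1, 5):
    print(f"J^{n}:")
    pprint(J**n)
    print()
```

The symbolic output makes the pattern along each superdiagonal easy to spot.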
crazygrey
#5
Oct2-06, 01:26 PM
P: 7
1) I wanted to prove that all eigenvectors in the vector space are linearly independent, not the multiples of an eigenvector by a scalar. If I have a set of vectors [v1, v2, v3, ..., vn], where 1, 2, 3, ..., n are indices, and this set belongs to a vector space, I want to prove that all of them are linearly independent...

2) Thanks, I will work it through...
HallsofIvy
#6
Oct2-06, 02:15 PM
Math
Emeritus
Sci Advisor
Thanks
PF Gold
P: 38,898
Quote by crazygrey
1) I wanted to prove that all eigenvectors in the vector space are linearly independent, not the multiples of an eigenvector by a scalar. If I have a set of vectors [v1, v2, v3, ..., vn], where 1, 2, 3, ..., n are indices, and this set belongs to a vector space, I want to prove that all of them are linearly independent...
You still haven't understood. "All eigenvectors" includes both an eigenvector v and any multiple of it, so that's not what you want to say. You can't prove that "all eigenvectors are linearly independent"; it's not true. You can show that eigenvectors corresponding to distinct eigenvalues are independent. But then it doesn't follow that you will have a "complete set of eigenvectors", that is, a basis consisting of eigenvectors. If such a basis exists, then writing the linear operator as a matrix in that basis gives a diagonal matrix, and not all matrices are diagonalizable; that's why you need the Jordan Normal form.
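
For completeness, here is a sketch of how the n = 2 calculation in post #3 extends to the general inductive step. It is the standard argument, written in the same notation used there.

```latex
% Sketch of the inductive step, in the notation of post #3.
% Assume v_1,\dots,v_{n-1} (with distinct eigenvalues w_1,\dots,w_{n-1}) are
% already known to be linearly independent, and suppose
%   a_1 v_1 + \dots + a_n v_n = 0.                                   (1)
\begin{align*}
\text{Apply } A \text{ to (1):}\quad
  & a_1 w_1 v_1 + \dots + a_n w_n v_n = 0, \\
\text{multiply (1) by } w_n:\quad
  & a_1 w_n v_1 + \dots + a_n w_n v_n = 0, \\
\text{subtract:}\quad
  & a_1 (w_1 - w_n) v_1 + \dots + a_{n-1}(w_{n-1} - w_n) v_{n-1} = 0.
\end{align*}
% By the induction hypothesis each coefficient a_i (w_i - w_n) = 0, and the
% eigenvalues are distinct, so a_1 = \dots = a_{n-1} = 0.  Then (1) reduces to
% a_n v_n = 0 with v_n \neq 0, so a_n = 0 as well.
```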
mathwonk
#7
Oct4-06, 11:04 PM
Sci Advisor
HW Helper
P: 9,428
define a generalized eigenvector of a matrix A, for the eigenvalue a, to be a vector v such that some power of (A - a) annihilates v, i.e. (A - a)^r v = 0 for some r > 0.


then if the characteristic polynomial of A is (X-a1)^r1 (X-a2)^r2 ... (X-as)^rs,
then A has r1 linearly independent generalized eigenvectors for the value a1, r2 for the value a2, etc.

if there are exactly t1 linearly independent actual eigenvectors for a1, t2 for a2, etc.,

then in the Jordan form for A there are exactly t1 Jordan blocks for a1, t2 Jordan blocks for a2, etc.


the size of these blocks is determined, in a slightly complicated way, by the dimensions of the kernels of the operators (A-a)^i for all i.


when ri = ti for all i, the matrix A is diagonalizable, i.e. when every generalized eigenvector is an actual eigenvector, i.e. when every vector annihilated by a power of (A-a) is annihilated by (A-a) itself.
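
As a concrete illustration of these counts (assuming Python with sympy is available), here is a small made-up 4x4 example with eigenvalue 2 of algebraic multiplicity 3 and eigenvalue 5 of multiplicity 1; only the block-count and multiplicity statements above are being exercised.

```python
# Illustrative check of the block-count statements on a hypothetical matrix.
from sympy import Matrix, eye

A = Matrix([[2, 1, 0, 0],
            [0, 2, 0, 0],
            [0, 0, 2, 0],
            [0, 0, 0, 5]])

P, J = A.jordan_form()                     # A = P*J*P**(-1)
print(J)                                   # two blocks for 2 (sizes 2 and 1), one for 5

t2 = len((A - 2*eye(4)).nullspace())       # actual eigenvectors for 2 (geometric mult.)
r2 = len(((A - 2*eye(4))**3).nullspace())  # generalized eigenvectors for 2 (algebraic mult.)
print(t2, r2)                              # 2, 3
```

Here r2 = 3 but t2 = 2, so this A has a generalized eigenvector for 2 that is not an actual eigenvector; it is not diagonalizable, and its Jordan form has two blocks for the eigenvalue 2, matching the criterion above.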
mathwonk
#8
Oct7-06, 10:16 AM
Sci Advisor
HW Helper
P: 9,428
you might want to read my 14-page treatment of all of linear algebra from scratch, through Jordan and rational forms, with proofs, on my webpage.


the independence of the generalized eigenspaces for distinct eigenvalues depends on the Euclidean algorithm as follows:


let V(i) = the subspace of vectors annihilated by (t-ci)^ri, where (t-ci)^ri is a factor in the minimal polynomial for the operator T.

Then map the product of the V(i) into the original space V by addition, i.e. an m tuple (v1,...,vm) goes to their sum v1+...+vm in V.

The independence of these different V(i) is exactly the injectivity of this map.

So consider the product Qi of all factors (t-cj)^rj for j ≠ i. These Qi are relatively prime, so by Euclid's algorithm there exist polynomials Pi such that the sum of the PiQi equals 1.

If v1+...+vm = 0, then each vi is a combination of the others. Now apply the sum of the PjQj to vi. Every term PjQj with j ≠ i annihilates vi, since Qj contains the factor (t-ci)^ri. The remaining term PiQi also annihilates vi: Qi annihilates every vj with j ≠ i, and vi can be written in terms of those vj, so Qi annihilates vi as well.

But the sum of the PjQj equals 1, so applying it to vi just gives vi back. Hence every vi = 0.

This does it.

a similar argument shows surjectivity, so V actually decomposes as a direct product of those generalized eigenspaces. This argument can be made to look more elementary, but no more understandable, and it is probably given in your book as an inductive argument.
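
The "relatively prime, so there exist Pi with sum PiQi = 1" step can be checked concretely. A minimal sympy sketch, using a made-up minimal polynomial (x - 1)^2 (x - 2)^3 (the spaces V(i) themselves are not needed for this part):

```python
# Bezout identity for the coprime Q_i, as used in the argument above, for the
# hypothetical minimal polynomial (x - 1)^2 * (x - 2)^3.
from sympy import symbols, expand, simplify, gcdex

x = symbols('x')
Q1 = expand((x - 2)**3)          # product of the factors with j != 1
Q2 = expand((x - 1)**2)          # product of the factors with j != 2

P1, P2, g = gcdex(Q1, Q2, x)     # extended Euclidean algorithm: P1*Q1 + P2*Q2 = g
print(g)                         # 1, since Q1 and Q2 are relatively prime
print(simplify(P1*Q1 + P2*Q2))   # 1
```

Substituting the operator T for x and applying both sides of P1*Q1 + P2*Q2 = 1 to a vector vi is exactly the cancellation used in the argument above.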
crazygrey
#9
Oct9-06, 01:09 PM
P: 7
Really appreciate your help, that was very helpful. Thanks
mathwonk
#10
Oct10-06, 06:43 PM
Sci Advisor
HW Helper
P: 9,428
wow, how kind of you to say so.

