Linear Algebra: Eigenvectors and Orthonormal Bases

Niles

Homework Statement


Consider a symmetric (and hence diagonalizable) n x n matrix A. Since A is diagonalizable, it has n linearly independent eigenvectors, and hence they span ##\mathbb{R}^n##.

Since the matrix A is symmetric, there exists an orthonormal basis consisting of eigenvectors.

My questions are:

1) Will this orthonormal basis of eigenvectors also span the same space ##\mathbb{R}^n##?

2) If two vectors are linearly independent, will they also be orthogonal?
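To make the setup in the homework statement concrete, here is a small NumPy sketch (not from the thread itself; the 4 x 4 symmetric matrix is just made-up example data) showing that a symmetric matrix does come with an orthonormal set of eigenvectors:

```python
import numpy as np

# Build an arbitrary symmetric matrix (example values only).
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2                      # symmetrizing makes A = A^T

eigenvalues, Q = np.linalg.eigh(A)     # columns of Q are eigenvectors of A

# Orthonormal basis: Q^T Q is the identity.
print(np.allclose(Q.T @ Q, np.eye(4)))                # True

# And the columns really are eigenvectors: A Q = Q diag(eigenvalues).
print(np.allclose(A @ Q, Q @ np.diag(eigenvalues)))   # True
```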
 
For 1): when does a set of vectors span the vector space? Do the eigenvectors satisfy these conditions? [Actually, you already gave the answer yourself... do you see where? ]

For 2): Consider (1, 0) and (1, 1) in ##\mathbb{R}^2##.
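As a quick numerical check of this counterexample (just a sketch, not part of the original reply):

```python
import numpy as np

u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])

# Linearly independent: the 2x2 matrix with u and v as columns is invertible.
print(np.linalg.det(np.column_stack([u, v])))   # 1.0, nonzero

# But not orthogonal: their dot product is nonzero.
print(np.dot(u, v))                             # 1.0, not 0
```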
 
1) They are linearly independent, so yes - I guess I answered my own question there!

2) Great, a counter-example, so no. Thanks!
 
1) Yep, it follows from the fact that "the eigenvectors are linearly independent" and that there are n of them. That is, they form a basis, as you said in the question. And of course a basis always spans the space (even a non-orthogonal and/or non-normalized one).
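As a sketch of that last remark (a basis spans the space even when it is not orthogonal), here is the earlier counterexample pair used as a basis of ##\mathbb{R}^2##; the target vector is just a made-up example:

```python
import numpy as np

# Columns are the basis vectors (1, 0) and (1, 1): independent but not orthogonal.
basis = np.column_stack([[1.0, 0.0], [1.0, 1.0]])
target = np.array([3.0, -2.0])              # arbitrary vector to expand in this basis

coeffs = np.linalg.solve(basis, target)     # coordinates of target in this basis
print(coeffs)                               # [ 5. -2.]  i.e. 5*(1,0) - 2*(1,1)
print(np.allclose(basis @ coeffs, target))  # True: the basis spans R^2
```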
 