Relationship between eigenspace and column space

SUMMARY

The discussion establishes that an n by n matrix A with n linearly independent eigenvectors need not be invertible: the eigenvectors span n-space, but if 0 is an eigenvalue the matrix is still singular. The converse fails as well; an invertible matrix need not have n linearly independent eigenvectors, as the matrix [[1, 1], [0, 1]] demonstrates. Finally, a relationship between eigenspace and column space is drawn for matrices of the form A = P'EP, where P is orthogonal (P' its transpose) and E is a diagonal matrix of eigenvalues: such a matrix is symmetric, its column space and row space coincide, and that space is spanned by the eigenvectors with non-zero eigenvalues, so its dimension equals the number of non-zero eigenvalues.

PREREQUISITES
  • Understanding of eigenvalues and eigenvectors
  • Knowledge of matrix invertibility
  • Familiarity with matrix diagonalization
  • Concept of column space and row space in linear algebra
NEXT STEPS
  • Study the implications of eigenvalues being zero on matrix invertibility
  • Explore the concept of matrix diagonalization in depth
  • Learn about the relationship between row space and column space
  • Investigate the properties of orthogonal matrices and their applications
USEFUL FOR

Students and professionals in mathematics, particularly those studying linear algebra, matrix theory, and eigenvalue problems. This discussion is also beneficial for educators teaching these concepts.

ybhan23
Is it true that if an n by n matrix A has n linearly independent eigenvectors, then it must also be invertible, because these n eigenvectors span n-space? And does this reasoning work the other way around: that is, if A is invertible, does that imply that n linearly independent eigenvectors can be found? More generally, is there some connection between the column space and the eigenspace?

On a side note, here is an unrelated but interesting question: why is it that if the square of a matrix is the negative identity matrix (-I), the matrix must have an even number of rows and columns?

Thanks in advance
 
ybhan23 said:
Is it true that if an n by n matrix A has n linearly independent eigenvectors, then it must also be invertible, because these n eigenvectors span n-space?
What if 0 is an eigenvalue?
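(A minimal example, spelling out the hint: ##\left( \begin{smallmatrix} 1 & 0 \\ 0 & 0 \end{smallmatrix} \right)## has the two independent eigenvectors ##(1,0)^T## and ##(0,1)^T##, with eigenvalues 1 and 0, yet it is singular. A matrix with n independent eigenvectors is invertible exactly when none of its eigenvalues is 0.)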

And does this reasoning work the other way around: that is, if A is invertible, does that imply that n linearly independent eigenvectors can be found?
No. Consider ##\left( \begin{smallmatrix} 1 & 1 \\ 0 & 1 \end{smallmatrix} \right)##.
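(To spell out why this is a counterexample: its determinant is 1, so it is invertible, but its characteristic polynomial is ##(\lambda - 1)^2## and the eigenspace for ##\lambda = 1## is only the one-dimensional span of ##(1,0)^T##. So invertibility alone does not supply a full set of independent eigenvectors.)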

On a side note, here is an unrelated but interesting question: why is it that if the square of a matrix is the negative identity matrix (-I), the matrix must have an even number of rows and columns?

Thanks in advance
If ##A^2 = -I## then ##\det(A^2) = \det(-I)##. Try to see if you can use this to answer your question.
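(For later readers, one way to finish the hint, assuming ##A## is real: ##\det(A^2) = \det(A)^2 > 0##, since ##A## is invertible here, while ##\det(-I) = (-1)^n## for an ##n \times n## matrix. Hence ##(-1)^n > 0##, which forces ##n## to be even. Over the complex numbers the statement fails; ##A = iI## squares to ##-I## in any dimension.)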
 
Thanks for your answer. The side question is completely clear to me now. But for the first two questions: is there really no connection between the eigenspace and the column space?
 
If ##A = P'EP## where ##P'P = I## (##P'## denotes the transpose of ##P##, so ##P## is orthogonal) and ##E## is a diagonal matrix of eigenvalues, then:

$$A^{-1} = P^{-1} E^{-1} (P')^{-1}$$

But ##P^{-1} = P'## and ##(P')^{-1} = P##.

So, ##A^{-1} = P' E^{-1} P##.

So as long as the eigenvalues are all non-zero, ##E^{-1}## exists, and the columns of ##P'## are eigenvectors of both ##A## and ##A^{-1}## (with eigenvalues ##\lambda_i## and ##1/\lambda_i## respectively).
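As a quick numerical sanity check of the claim above (a minimal sketch in NumPy, with an arbitrary symmetric example matrix of my own choosing, not anything from the thread):

Python:
import numpy as np

# An arbitrary symmetric matrix, so A = P'EP with P orthogonal.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# For symmetric input, np.linalg.eigh returns the eigenvalues and an
# orthogonal matrix whose columns are the eigenvectors (this matrix
# plays the role of P' above).
eigvals, Pt = np.linalg.eigh(A)

A_inv = np.linalg.inv(A)
for i in range(len(eigvals)):
    v = Pt[:, i]
    # Each eigenvector of A is an eigenvector of A^{-1} with eigenvalue 1/lambda.
    print(np.allclose(A_inv @ v, (1.0 / eigvals[i]) * v))  # True, True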
-----------------------
Now, the column space of ##A## is the row space of its transpose. Assuming again that ##A = P'EP##:

$$A' = (P'EP)' = P'E'P = P'EP = A$$

so ##A## is symmetric and its column space and row space coincide. That common space is spanned by the eigenvectors whose eigenvalues are non-zero, so its dimension (the rank of ##A##) equals the number of non-zero eigenvalues. In the event that ##E## is in upper triangular form (not a diagonal matrix), I believe a similar statement can be made.
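And a similar hedged sketch for the rank statement, again with a toy symmetric matrix of my own (not from the thread): the eigenvectors with non-zero eigenvalues span the column space, so the rank equals the number of non-zero eigenvalues.

Python:
import numpy as np

# A rank-1 symmetric matrix; its eigenvalues are 2 and 0.
A = np.array([[1.0, 1.0],
              [1.0, 1.0]])

eigvals, Pt = np.linalg.eigh(A)          # A = Pt @ diag(eigvals) @ Pt.T
nonzero = np.abs(eigvals) > 1e-12

# The rank of A equals the number of non-zero eigenvalues.
print(np.linalg.matrix_rank(A), np.count_nonzero(nonzero))   # 1 1

# The eigenvector with the non-zero eigenvalue spans the column space:
v = Pt[:, nonzero][:, 0]                 # eigenvector for eigenvalue 2
col = A[:, 0]                            # a spanning vector of col(A)
coef = (col @ v) / (v @ v)
print(np.allclose(coef * v, col))        # True: the column is a multiple of v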
 
