Relationship between eigenspace and column space


Discussion Overview

The discussion centers on the relationship between eigenspace and column space of matrices, particularly focusing on the implications of having linearly independent eigenvectors and the conditions for invertibility. It also touches on a related question regarding the properties of matrices whose square equals the negative identity matrix.

Discussion Character

  • Exploratory
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • Some participants propose that if an n by n matrix A has n linearly independent eigenvectors, it must be invertible, since these eigenvectors span n-space.
  • Others question this reasoning by asking what happens if 0 is an eigenvalue, suggesting that the presence of a zero eigenvalue would contradict invertibility.
  • One participant provides a counterexample of an invertible matrix that does not have n linearly independent eigenvectors, specifically citing the matrix ##\left( \begin{smallmatrix} 1 & 1 \\ 0 & 1 \end{smallmatrix} \right)##.
  • There is a query about the existence of connections between eigenspace and column space, with one participant expressing uncertainty about this relationship.
  • A mathematical formulation is presented for the inverse of a matrix in terms of its eigenvectors and eigenvalues, suggesting that when the eigenvalues are non-zero, ##A## and ##A^{-1}## share the same eigenvectors.
  • Another participant notes that the column space is the row space of the transpose, indicating that the dimensions of the eigenspaces associated with the column and row spaces may be equal under certain conditions.

Areas of Agreement / Disagreement

Participants express differing views on the implications of eigenvectors for invertibility, with no consensus reached on the connection between eigenspace and column space. The discussion remains unresolved regarding the broader implications of these relationships.

Contextual Notes

Participants have not fully explored the implications of eigenvalues being zero or the specific conditions under which the relationships between eigenspace and column space hold. There are also assumptions made about the forms of matrices that have not been explicitly stated.

ybhan23
Is it true that if an n by n matrix A has n linearly independent eigenvectors, then it must also be invertible, because these n eigenvectors span n-space? But does this reasoning work the other way around: that is, if A is invertible, does that imply that n linearly independent eigenvectors can be found? More generally, is there some connection between the column space and the eigenspace?

On a side note, here is an unrelated but interesting question: why is it that if the square of a matrix is the negative identity matrix (-I), then it implies that the matrix has an even number of columns and rows?

Thanks in advance
 
ybhan23 said:
Is it true that if an n by n matrix A has n linearly independent eigenvectors, then it must also be invertible, because these n eigenvectors span n-space?
What if 0 is an eigenvalue?
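For a concrete instance of this objection (my own example, not from the reply): ##\left( \begin{smallmatrix} 1 & 0 \\ 0 & 0 \end{smallmatrix} \right)## has two linearly independent eigenvectors but determinant zero. A quick numerical check, assuming NumPy is available:

```python
import numpy as np

# Diagonal matrix with eigenvalues 1 and 0: its eigenvectors (the standard
# basis vectors) are linearly independent and span R^2, yet det(A) = 0,
# so A is not invertible.
A = np.diag([1.0, 0.0])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)                           # [1. 0.]
print(np.linalg.matrix_rank(eigenvectors))   # 2 -> eigenvectors span R^2
print(np.linalg.det(A))                      # 0.0 -> not invertible
```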

But does this reasoning work the other way around: that is, if A is invertible, does that imply that n linearly independent eigenvectors can be found?
No. Consider ##\left( \begin{smallmatrix} 1 & 1 \\ 0 & 1 \end{smallmatrix} \right)##.
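As a numerical illustration of this counterexample (a sketch, assuming NumPy is available): the matrix is invertible, but its only eigenvalue is 1 and the corresponding eigenspace is one-dimensional, so no two linearly independent eigenvectors exist.

```python
import numpy as np

# The shear matrix from the counterexample: det = 1, so it is invertible,
# but its single eigenvalue 1 has a one-dimensional eigenspace.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

print(np.linalg.det(A))       # 1.0 -> invertible
print(np.linalg.eigvals(A))   # [1. 1.]

# Geometric multiplicity of eigenvalue 1 = 2 - rank(A - I) = 1,
# so the eigenvectors cannot span R^2.
print(2 - np.linalg.matrix_rank(A - np.eye(2)))   # 1
```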

On a side note, here is an unrelated but interesting question: why is it that if the square of a matrix is the negative identity matrix (-I), then it implies that the matrix has an even number of columns and rows?

Thanks in advance
If ##A^2 = -I## then ##\det(A^2) = \det(-I)##. Try to see if you can use this to answer your question.
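Spelling the hint out for reference (assuming ##A## has real entries):

$$\det(A)^2 = \det(A^2) = \det(-I) = (-1)^n,$$

and the left-hand side is the square of a real number, hence non-negative, so ##(-1)^n = 1## and ##n## must be even.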
 
Thanks for your answer. The third question is completely clear to me now. But for the first and second questions, are there no connections between eigenspace and column space?
 
If ##A = P'EP## where ##P'P = I## and ##E## is a diagonal matrix of eigenvalues (such an orthogonal decomposition exists when, for instance, ##A## is real symmetric), then:

##A^{-1} = P^{-1} E^{-1} (P')^{-1}##

But ##P^{-1} = P'## and ##(P')^{-1} = P##,

so ##A^{-1} = P' E^{-1} P##.

So as long as the eigenvalues are non-zero, the columns of ##P'## are eigenvectors of both ##A## and ##A^{-1}##.
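A quick numerical check of this claim for a symmetric matrix (a sketch, assuming NumPy; the matrix and values are my own illustration):

```python
import numpy as np

# Real symmetric matrix with non-zero eigenvalues, so A = V diag(w) V^T
# with V orthogonal (V plays the role of P' above).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

w, V = np.linalg.eigh(A)          # eigenvalues w, orthonormal eigenvectors V
A_inv = np.linalg.inv(A)

# The same eigenvector matrix diagonalizes A^{-1}, with reciprocal eigenvalues.
print(np.allclose(A_inv, V @ np.diag(1.0 / w) @ V.T))   # True
print(np.allclose(A_inv @ V, V @ np.diag(1.0 / w)))     # True
```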
-----------------------
Now, the column space is the row space of the transpose. Assuming again that ##A = P'EP##, then:

##A' = (P'EP)' = P'E'P = P'EP,##

so ##A' = A##, and the dimensions of the eigenspaces associated with the column space and the row space are equal. In the event that ##E## is in upper triangular form (not a diagonal matrix), I believe a similar statement can be made.
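As a small numerical illustration of the symmetric case (my own sketch, assuming NumPy): since ##A' = A##, the column space and row space coincide, and the dimension of the column space equals the number of non-zero eigenvalues.

```python
import numpy as np

# Real symmetric matrix with eigenvalues 3, 1, 0 (built from an orthogonal
# eigenvector matrix), so it has a one-dimensional null space.
Q, _ = np.linalg.qr(np.array([[1.0, 2.0, 0.0],
                              [0.0, 1.0, 1.0],
                              [1.0, 0.0, 1.0]]))
A = Q @ np.diag([3.0, 1.0, 0.0]) @ Q.T

print(np.allclose(A, A.T))              # True: column space = row space
print(np.linalg.matrix_rank(A))         # 2 = dimension of the column space
print(np.sum(np.abs(np.linalg.eigvalsh(A)) > 1e-10))   # 2 non-zero eigenvalues
```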
 
