Eigenvalue, Eigenvector and Eigenspace

  • Context: Graduate
  • Thread starter: negation
  • Tags: Eigenvalue, Eigenvector

Discussion Overview

The discussion revolves around the concepts of eigenvalues, eigenvectors, and eigenspaces in linear algebra. Participants explore the definitions and relationships between these concepts, including the implications of considering the zero vector in the context of eigenvectors. The conversation includes mathematical reasoning and clarification of terms, as well as some debate over definitions and interpretations.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • Some participants assert that the eigenspace Eλ is the nullspace N(λIn-A) of the matrix λIn-A and question the validity of equating it to a single vector v where Av = 0.
  • Others challenge the idea of constructing an eigenspace from a single vector, emphasizing that an eigenspace is a set of vectors.
  • There is confusion regarding the interpretation of the statement [N(λIn-A)][Eλ=-1] = 0, with some suggesting that it could be true under certain interpretations.
  • Participants discuss the definition of eigenvectors, with some stating that the zero vector is not considered an eigenvector, while others argue that definitions vary and some texts include the zero vector in the definition of eigenvectors.
  • Questions are raised about the relationship between eigenvalues and eigenvectors, particularly whether eigenvalues correspond to specific eigenvectors and how to express this mathematically.
  • There is a discussion on the necessity of performing row reduction to reduced row echelon form (RREF) on matrices and the implications of using row echelon form (REF) instead.

Areas of Agreement / Disagreement

Participants express differing views on the definition of eigenvectors, particularly regarding the inclusion of the zero vector. There is no consensus on the interpretation of certain mathematical statements, and the discussion remains unresolved on several points, particularly concerning the construction of eigenspaces and the implications of definitions.

Contextual Notes

Some statements made by participants rely on specific interpretations of mathematical definitions, which may not be universally accepted. The discussion also highlights the complexity of relationships between eigenvalues, eigenvectors, and eigenspaces, with various assumptions and conditions present in the arguments.

negation
Let's say my eigenvalue λ=-1 and we assume eigenvector of zero are non-eigenvector.
An eigenspace is mathematically represented as Eλ = N(λ.In-A) which essentially states, in natural language, the eigenspace is the nullspace of a matrix.
N(λ.In-A) is a matrix.

Would it then be valid to say that the eigenspace, Eλ, whose eigenvalue, λ=-1, is the nullspace of the matrix, N(λ.In-A), is equivalent to the vector, v, where
Av = 0?

If v is the nullspace of the matrix A then Av = 0, and similarly, if Eλ is the nullspace of a matrix, N(λ.In-A), then, it must equally be true that
[ N(λ.In-A) ] [Eλ=-1] = 0
 
You mean you are trying to construct an eigenspace from a single vector?
 
It is not clear exactly what you mean by
[ N(λ.In-A) ] [Eλ=-1] = 0
Certainly some interpretations would make the statement true, but
[ N(λ.In-A) ] =[Eλ=-1]
is also true (some interpretations)
 
negation said:
Let's say my eigenvalue λ=-1 and we assume eigenvector of zero are non-eigenvector.
"We assume eigenvector of zero are non-eigenvector"? I don't know what this means. Do you mean that you are assuming that 0 is not an eigenvector or are you asserting that an "eigenvector" must not be the 0 vector?

An eigenspace is mathematically represented as Eλ = N(λ.In-A) which essentially states, in natural language, the eigenspace is the nullspace of a matrix.
N(λ.In-A) is a matrix.

Would it then be valid to say that the eigenspace, Eλ, whose eigenvalue, λ=-1, is the nullspace of the matrix, N(λ.In-A), is equivalent to the vector, v, where
Av = 0?
No. First, it is a set of vectors, not a single vector. Second, it is the set of vectors such that (λ.In-A)v = 0, or equivalently, λ.In v = Av, as you said, not "Av = 0".

If v is the nullspace of the matrix A then Av = 0, and similarly, if Eλ is the nullspace of a matrix, N(λ.In-A), then, it must equally be true that
[ N(λ.In-A) ] [Eλ=-1] = 0
I can't make sense out of that. You have, as part of your hypothesis, "If v is (in) the nullspace of the matrix A" but your conclusion has no "v" in it.
 
HallsofIvy said:
"We assume eigenvector of zero are non-eigenvector"? I don't know what this means. Do you mean that you are assuming that 0 is not an eigenvector or are you asserting that an "eigenvector" must not be the 0 vector?


No. First, it is a set of vectors, not a single vector. Second, it is the set of vectors such that (λ.In-A)v = 0, or equivalently, λ.In v = Av, as you said, not "Av = 0".


I can't make sense out of that. You have, as part of your hypothesis, "If v is (in) the nullspace of the matrix A" but your conclusion has no "v" in it.

You probably can't because it doesn't make sense. It's still a new topic to me, so I figure the fastest way to refine my understanding is to state what I know and have it corrected where the understanding is flawed.

I'm stating that, as I've learnt it, the zero vector is not considered an eigenvector.

Let's see:

λ is an eigenvalue of the eigenvector V IFF det(λIn-A) = 0

Say we have a matrix [1 2;4 3] and det(λIn-A) = 0 gives
λ1=5 and λ2 = -1

1) Does this then state that λ1=5 and λ2 = -1 are eigenvalues corresponding to the eigenvector V?

2) Eλ12=Null(λIn-A) is interpreted as the set of vectors with eigenvalues of λ1 and λ2 that map
the matrix (λIn-A) to the zero vector. True?

3) where λ1=-1:

I have Eλ1=N(λIn-A) = N(-1[1 0; 0 1] - [1 2; 4 3])
= N([-2 -2; -4 -4])

4) I shall take only λ1 as an example. Since Eλ1 maps the matrix [-2 -2; -4 -4] to the zero vector, could it then be written
as AV=0
where A = [1 2; 4 3] and V is the set of vectors V = {v1, v2,...vn} that map the matrix A to the zero vector?

If I could, then matrix A after RREF (is it a necessary condition to perform RREF on matrix A? What is the implication if I perform REF instead of RREF?)
becomes [ 1 1; 0 0 ] and V = {v1, v2}
and so,
[ 1 1; 0 0] [v1;v2] = [0;0]
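As an editorial aside (not part of the original thread), the row reduction above can be checked mechanically with sympy, using the thread's example matrix and the eigenvalue λ1 = -1:

```python
from sympy import Matrix

A = Matrix([[1, 2], [4, 3]])
lam = -1  # the eigenvalue taken as the example above

# Form lam*I - A and row-reduce it; this is the matrix whose nullspace is E_lam
M = lam * Matrix.eye(2) - A        # Matrix([[-2, -2], [-4, -4]])
rref_M, pivots = M.rref()
print(rref_M)                      # Matrix([[1, 1], [0, 0]])

# A basis for the nullspace N(lam*I - A), i.e. the eigenspace E_lam
basis = M.nullspace()
print(basis[0])                    # Matrix([[-1], [1]])
```

On the REF-versus-RREF question raised above: any echelon form of λIn - A has the same solution set, so REF is enough to solve the system; RREF is merely convenient because the free variables can be read off directly.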
 
negation said:
You probably can't because it doesn't make sense. It's still a new topic to me, so I figure the fastest way to refine my understanding is to state what I know and have it corrected where the understanding is flawed.

I'm stating that, as I've learnt it, the zero vector is not considered an eigenvector.
By definition, an eigenvector is nonzero.
negation said:
Let's see:

λ is an eigenvalue of the eigenvector V IFF det(λIn-A) = 0

Say we have a matrix [1 2;4 3] and det(λIn-A) = 0 gives
λ1=5 and λ2 = -1

1) Does this then state that λ1=5 and λ2 = -1 are eigenvalues corresponding to the eigenvector V?
These are eigenvalues of that matrix. Each eigenvalue has its own eigenvector. For the eigenvalue λ1 = 5, an eigenvector associated with this eigenvalue is any nonzero solution x of the matrix equation Ax = 5x. Equivalently, x is a nonzero solution of (A - 5I)x = 0.
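As a quick numerical cross-check of this (an editorial sketch, not from the thread), numpy recovers exactly these two eigenvalues for the example matrix, and each returned eigenvector satisfies Ax = λx:

```python
import numpy as np

A = np.array([[1.0, 2.0], [4.0, 3.0]])

# Columns of `vecs` pair with the entries of `vals`
vals, vecs = np.linalg.eig(A)
print(np.sort(vals))               # [-1.  5.] (up to floating point)

# Every eigenpair satisfies A x = lambda x
for lam, x in zip(vals, vecs.T):
    assert np.allclose(A @ x, lam * x)
```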
negation said:
2) Eλ12=Null(λIn-A) is interpreted as the set of vectors with eigenvalues of λ1 and λ2 that map
the matrix (λIn-A) to the zero vector. True?
Not sure. This is a pretty cumbersome way to say it.
For a given eigenvalue λ of a matrix A, x is an eigenvector if it is a nonzero solution of the equation Ax = λx.
This is equivalent to (A - λI)x = 0, so x is any nonzero vector in the nullspace of A - λI.
negation said:
3) where λ1=-1:

I have Eλ1=N(λIn-A) = N(-1[1 0; 0 1] - [1 2; 4 3])
= N([-2 -2; -4 -4])


4) I shall take only λ1 as an example. Since Eλ1 maps the matrix [-2 -2; -4 -4] to the zero vector, could it then be written
as AV=0
where A = [1 2; 4 3] and V is the set of vectors V = {v1, v2,...vn} that map the matrix A to the zero vector?
No. You're not looking for solutions of Ax = 0 - you're looking for solutions of (A - λI)x = 0.
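To make the distinction concrete (an editorial sketch; v = (-1, 1) is a nullspace basis vector for λ = -1 computed earlier in the thread): an eigenvector for λ = -1 does not satisfy Av = 0, but it does satisfy (A - λI)v = 0:

```python
import numpy as np

A = np.array([[1, 2], [4, 3]])
lam = -1
v = np.array([-1, 1])    # an eigenvector for lam = -1

print(A @ v)                        # [ 1 -1]  -- equals lam * v, not the zero vector
print((A - lam * np.eye(2)) @ v)    # [0. 0.]  -- v lies in N(A - lam*I)
```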
negation said:
If I could, then matrix A after RREF (is it a necessary condition to perform RREF on matrix A? What is the implication if I perform REF instead of RREF?)
becomes [ 1 1; 0 0 ] and V = {v1, v2}
and so,
[ 1 1; 0 0] [v1;v2] = [0;0]
 
Mark44 said:
By definition, an eigenvector is nonzero.

I'm going to jump in here. Yes, many textbooks define an eigenvector to be non-zero. But many others define "eigenvalue" by "λ is an eigenvalue for a linear operator A if there exists a non-zero vector, v, such that Av = λv" and then define an eigenvector, corresponding to eigenvalue λ, to be any vector, v, such that Av = λv, which includes the 0 vector.

Personally, I prefer that because it allows us to say "the set of all eigenvectors, corresponding
to eigenvalue λ, is a subspace" rather than having to say "the set of all eigenvectors, corresponding to eigenvalue λ, together with the 0 vector, is a subspace".
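The subspace point can be checked numerically (an editorial sketch, using the thread's example matrix): every scalar multiple of an eigenvector for λ, including the zero vector 0·v, satisfies Av = λv, so the solution set of Av = λv is closed under the vector-space operations:

```python
import numpy as np

A = np.array([[1.0, 2.0], [4.0, 3.0]])
lam = -1.0
v = np.array([-1.0, 1.0])    # spans the eigenspace for lam = -1

# Scalar multiples (including c = 0, giving the zero vector) all satisfy A x = lam x,
# which is exactly why admitting the zero vector makes the set a subspace.
for c in [0.0, 1.0, -3.5, 2.0]:
    x = c * v
    assert np.allclose(A @ x, lam * x)
print("all multiples of v, including 0, satisfy A x = lam x")
```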
 
