Help to understand this eigenvector case

  • Context: MHB
  • Thread starter: ognik
  • Tags: Eigenvector

Discussion Overview

The discussion revolves around understanding eigenvectors and their properties, particularly in the context of a specific practice problem involving a row-reduced matrix that is an identity matrix. Participants explore the implications of finding eigenvectors, the conditions under which they are defined, and the nature of eigenspaces.

Discussion Character

  • Exploratory, Technical explanation, Debate/contested

Main Points Raised

  • One participant describes a practice problem where they found eigenvalues and eigenvectors, leading to confusion about the nature of the eigenspace and the validity of the zero vector as an eigenvector.
  • Another participant emphasizes that finding eigenvectors involves calculating the determinant of $xI - A$ and, for each root $\lambda$, solving the system $(\lambda I - A)v = 0$, noting that this system is typically under-determined.
  • A participant asserts that eigenvectors must be non-zero and questions the validity of the zero vector as an eigenvector.
  • Another participant references a lemma stating that an eigenspace contains the zero vector but argues that this does not imply the zero vector is an eigenvector.
  • One participant clarifies that an eigenspace consists of all eigenvectors corresponding to a specific eigenvalue, along with the zero vector, but reiterates that the zero vector itself is not considered an eigenvector.

Areas of Agreement / Disagreement

Participants initially differ on the status of the zero vector in relation to eigenvectors. There is consensus that eigenvectors must be non-zero; the point of contention is what follows from the fact that every eigenspace contains the zero vector.

Contextual Notes

There is an unresolved discussion about the definitions and properties of eigenvectors and eigenspaces, particularly regarding the inclusion of the zero vector and the conditions under which eigenvectors are defined.

ognik
Did a practice problem finding eigenvalues & eigenvectors, ended with this row-reduced matrix: $ \begin{bmatrix}1&0&0\\0&1&0\\0&0&1\end{bmatrix}$; solving for the eigenvectors, I could get $x_1 = x_2 = x_3 = 0$, in which case the eigenspace would be just the zero vector.

Instead, the answer is $ \begin{bmatrix}0\\0\\1\end{bmatrix}$. I think it's because I need to choose 1 independent variable (right word?), and a method is to set the last variable to some real $t$: $x_3 = t$, then $x_1 = x_2 = 0t$ ... but above I found $x_3 = 0$? So I know to set $x_3$ to $t$ - but it doesn't make sense to me in this particular case where there are no free variables?

(I can see that it is an identity matrix, which is also a Markov matrix, but I don't think those are relevant?)
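A minimal sketch of the free-variable step, using an assumed matrix (the original $A$ isn't quoted in the thread): if $\lambda$ really is an eigenvalue, the row-reduced form of $\lambda I - A$ has at least one zero row. For instance, with

$$A = \begin{bmatrix}2&0&0\\0&2&0\\0&0&3\end{bmatrix}, \qquad \lambda = 3: \qquad \lambda I - A = \begin{bmatrix}1&0&0\\0&1&0\\0&0&0\end{bmatrix},$$

so $x_3 = t$ is free while $x_1 = x_2 = 0$, giving the eigenvectors $t(0,0,1)$ for any $t \neq 0$. A reduction all the way to the identity would force $x_1 = x_2 = x_3 = 0$ and leave no free variable at all.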
 
It's not clear "what" matrix you row-reduced, nor why, so it's impossible to verify you did it correctly.

That said, finding eigenvectors usually is a two-step process:

Step 1 - given the matrix $A$, calculate the determinant of $xI - A$ (this will be a polynomial in $x$).

Step 2 - for any root of that polynomial, say $\lambda$, solve the system of equations:

$(\lambda I - A)v = 0$.

This system will typically be "under-determined", since we know that $\lambda I - A$ is singular (why?).

Typically, one will get a solution of the form (in $3$ dimensions, for example): $(at,bt,ct) = t(a,b,c)$. Any non-zero value of $t$ gives an eigenvector. Often, one of $a,b,c$ is chosen to be $1$, and it is often possible to have all $3$ coordinates be chosen to be integers.
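
A short worked instance of these two steps, on an assumed $2\times 2$ matrix (not the one from the original problem):

$$A = \begin{bmatrix}2&1\\0&3\end{bmatrix}, \qquad \det(xI - A) = \det\begin{bmatrix}x-2&-1\\0&x-3\end{bmatrix} = (x-2)(x-3).$$

For the root $\lambda = 2$, the system $(\lambda I - A)v = 0$ reads $\begin{bmatrix}0&-1\\0&-1\end{bmatrix}v = 0$, so $v_2 = 0$ and $v_1 = t$ is free: $v = t(1,0)$. For $\lambda = 3$ it reads $\begin{bmatrix}1&-1\\0&0\end{bmatrix}v = 0$, so $v_1 = v_2 = t$: $v = t(1,1)$. In each case one variable is left free (the system is under-determined), and any non-zero value of $t$ gives an eigenvector.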
 
Hi - I'm sure it's correct - and it's not the first problem I've encountered where solving $(\lambda I - A)v = 0$ could have all the variables $= 0$. (It is singular 'cos we found eigenvalues for which the characteristic polynomial $= 0$, in turn from $\det(\lambda I - A) = 0$.)

I know that the method is to choose one variable (in this case $v_3$) to be 1, so we get $t(0,0,1)$. But I'd like to understand why $t(0,0,0)$ is not valid (because solving $(\lambda I - A)v = 0$ gives $v_1 = v_2 = v_3 = 0$)?
 
Eigenvectors are always non-zero.
 
Deveno said:
Eigenvectors are always non-zero.

Hi, just came across a lemma that says 'an eigenspace is a subspace' - which I knew. But it then goes on to say

"Proof. An eigenspace must be nonempty — for one thing it contains the zero vector — and so we need only check closure"

Doesn't this suggest that the zero vector CAN be an eigenvector?
 
An eigenspace is composed of all eigenvectors belonging to a certain eigenvalue, AND the zero vector (which is NOT an eigenvector).
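
Stated symbolically, for an eigenvalue $\lambda$ of $A$:

$$E_\lambda = \{\, v : Av = \lambda v \,\} = \ker(\lambda I - A),$$

which is a subspace and therefore contains the zero vector, while an eigenvector is by definition a non-zero element of $E_\lambda$. Excluding $v = 0$ is what keeps eigenvalues meaningful: $A\,0 = \lambda\,0$ holds for every scalar $\lambda$, so if the zero vector counted as an eigenvector, every number would be an eigenvalue of every matrix.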
 
