Am I Making an Obvious Mistake with These Eigenvectors?

  • Context: Undergrad
  • Thread starter: hnicholls
  • Tags: Eigenvectors, Matrix

Discussion Overview

The discussion revolves around the calculation of eigenvectors for given matrices, specifically focusing on a diagonal matrix and a non-diagonal matrix. Participants explore the methods for finding eigenvectors, the implications of scalar multiples of eigenvectors, and the uniqueness of eigenvectors associated with eigenvalues.

Discussion Character

  • Technical explanation
  • Conceptual clarification
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant calculates the eigenvector for a diagonal matrix but questions the result, suggesting a potential mistake in their reasoning.
  • Another participant challenges the conclusion that a specific eigenvector must have a fixed value, emphasizing that eigenvectors can be scaled by any non-zero scalar.
  • It is noted that the standard basis vectors are already eigenvectors for diagonal matrices corresponding to their eigenvalues.
  • Participants discuss the method of using the characteristic equation to find eigenvalues and eigenvectors, questioning whether this method applies to diagonal matrices.
  • There is a clarification that the eigenbasis of a diagonal matrix is not uniquely determined, and the rank of the linear system relates to the dimensionality of the subspace spanned by eigenvectors.
  • One participant provides an example of a non-diagonal matrix and discusses the process of finding eigenvectors, highlighting the importance of correctly interpreting the resulting equations.
  • Another participant asks about the conditions for the infinite number of eigenvectors corresponding to a given eigenvalue, leading to a discussion about scalar multiples of eigenvectors.

Areas of Agreement / Disagreement

Participants express differing views on the interpretation of eigenvectors and the methods for calculating them. While some agree on the properties of eigenvectors, there remains uncertainty regarding the application of methods to diagonal versus non-diagonal matrices.

Contextual Notes

Limitations in understanding arise from the assumption that eigenvectors must have specific values rather than recognizing their scalar nature. The discussion also highlights the distinction between diagonal and non-diagonal matrices in the context of eigenvector calculations.

Who May Find This Useful

Students and practitioners in linear algebra, particularly those interested in eigenvalues and eigenvectors, may find this discussion relevant as it explores common misconceptions and clarifies the properties of eigenvectors in different matrix contexts.

hnicholls
I am evaluating the following 2 x 2 matrix:

|2 0 |
|0 3 |

with eigenvalues 2 and 3.

If I use 2 and calculate the eigenvector:

R - λI =

|2-λ 0 |
|0 3-λ |

R - λI =

|0 0 |
|0 1 |

|0 0 ||a| =
|0 1 ||b|

|0|
|0|


a = 0 and b = 1

So eigenvector of eigenvalue 2 is

|0|
|1|


but it seems that it ought to be

|1|
|0|

Am I making an obvious mistake?
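[A quick numerical cross-check of the expected answer, using NumPy; this sketch is an editorial addition, not part of the original post. It confirms that the eigenvector for eigenvalue 2 is a multiple of (1, 0), not (0, 1).]

```python
import numpy as np

# The diagonal matrix from the post.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

# Eigenvalue w[i] pairs with eigenvector column V[:, i].
w, V = np.linalg.eig(A)

# Locate eigenvalue 2 and pull out its eigenvector.
i = int(np.argmin(np.abs(w - 2.0)))
v = V[:, i]

assert np.allclose(A @ v, 2.0 * v)   # it really is an eigenvector for 2
assert np.isclose(v[1], 0.0)         # and it is a multiple of (1, 0)
```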
 
hnicholls said:
|0 0 ||a| =
|0 1 ||b|

|0|
|0|


a = 0 and b = 1
This conclusion is what you need to rethink.
 
From
[tex]\begin{bmatrix}0 & 0 \\ 0 & 1\end{bmatrix}\begin{bmatrix}a \\ b\end{bmatrix}= \begin{bmatrix}0 \\ 0 \end{bmatrix}[/tex]
we get
[tex]\begin{bmatrix}0(a) \\ 1(b)\end{bmatrix}= \begin{bmatrix}0 \\ b\end{bmatrix}= \begin{bmatrix}0 \\ 0 \end{bmatrix}[/tex]

That does NOT say "a=0 and b= 1".
 
The matrix you are considering is already diagonal. This means that the standard basis vectors:

[tex] \begin{bmatrix}1 \\ 0\end{bmatrix}, \begin{bmatrix}0 \\ 1\end{bmatrix}[/tex]

are already eigenvectors of the matrix, corresponding to the eigenvalues in the upper left and lower right corners, respectively.

Also:

hnicholls said:
|0 0 ||a| =
|0 1 ||b|

|0|
|0|

a = 0 and b = 1

is wrong. If you write down the equations corresponding to that matrix equality you will have:

[tex] \left\{\begin{array}{lcl} 0 & = & 0 \\ b & = & 0 \end{array}\right.[/tex]

As you can see, this puts a constraint on [itex]b[/itex] and not [itex]a[/itex].
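[A numerical illustration of this point, added editorially as a NumPy sketch: in the system (A - 2I)x = 0 for the diagonal matrix, b is forced to zero while a is completely free.]

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
M = A - 2.0 * np.eye(2)   # [[0, 0], [0, 1]]

# The system M @ [a, b] = 0 reads: 0 = 0 and b = 0,
# so b is constrained but a can be anything.
for a in (1.0, -3.5, 100.0):
    assert np.allclose(M @ np.array([a, 0.0]), 0.0)   # any (a, 0) solves it

assert not np.allclose(M @ np.array([0.0, 1.0]), 0.0) # (0, 1) does not
```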
 
Thank you all. These responses were very helpful.

I now have this question: if there are no constraints on the value of a, how can we determine that the value of a is 1 for this diagonal matrix?
 
You can't. If [itex]\mathbf{x}[/itex] is an eigenvector of a matrix, then so is [itex]c \mathbf{x}[/itex] for any [itex]c \neq 0[/itex]. It is sometimes conventional to impose the normalization condition:

[tex] \mathbf{x}^{\dagger} \cdot \mathbf{x} = 1[/tex]

This fixes the magnitude of [itex]a[/itex], but not its phase (if we consider it as a complex number) or sign (if we consider it a real one). It is conventional to have `most' of the components with positive sign. That is why we choose the phase to be zero.
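[An editorial NumPy sketch of this convention: normalizing fixes the magnitude of the eigenvector, but the sign (or phase) remains a free choice.]

```python
import numpy as np

A = np.diag([2.0, 3.0])

v = np.array([5.0, 0.0])           # some eigenvector for eigenvalue 2
v_hat = v / np.linalg.norm(v)      # impose the normalization x . x = 1

assert np.isclose(v_hat @ v_hat, 1.0)

# The sign is still undetermined: -v_hat is also normalized
# and is still an eigenvector for eigenvalue 2.
assert np.isclose((-v_hat) @ (-v_hat), 1.0)
assert np.allclose(A @ (-v_hat), 2.0 * (-v_hat))
```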
 
I understand that any scalar multiple of the basis eigenvector will also be an eigenvector, but what I am still confused by is that for matrices that are not diagonal we use the characteristic equation to compute the eigenvalues and then find the eigenvectors by calculating a matrix equation which represents a system of two linear equations, which in turn reduces to a single linear equation which is solved (in my example) for a and b.

Does this procedure not work if we have a diagonal matrix?

In other words, do diagonal matrices, where the eigenvectors are basis vectors (of the standard basis) not allow for a computation of their eigenvectors using the technique we would use for a matrix which is not diagonalized?
 
hnicholls said:
I understand that any scalar multiple of the basis eigenvector will also be an eigenvector, but what I am still confused by is that for matrices that are not diagonal we use the characteristic equation to compute the eigenvalues and then find the eigenvectors by calculating a matrix equation which represents a system of two linear equations, which in turn reduces to a single linear equation which is solved (in my example) for a and b.

Does this procedure not work if we have a diagonal matrix?

In other words, do diagonal matrices, where the eigenvectors are basis vectors (of the standard basis) not allow for a computation of their eigenvectors using the technique we would use for a matrix which is not diagonalized?

The eigenbasis of a matrix is not uniquely determined. Do you know what the term rank of a linear system means?
 
Yes. The maximum number of linearly independent rows or columns in a given matrix.
 
  • #10
Well, when you substitute a particular eigenvalue into A - λI, the rank of the resulting system becomes lower than the dimension of the matrix, and the difference of these two numbers equals the dimension of the subspace spanned by the corresponding eigenvectors. Even when those subspaces are one-dimensional, i.e. the eigenvalues are non-degenerate, the norm is not determined.
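[For the 2 x 2 example in this thread, the rank drop can be checked directly; an editorial NumPy sketch:]

```python
import numpy as np

A = np.diag([2.0, 3.0])
n = A.shape[0]

for lam in (2.0, 3.0):
    M = A - lam * np.eye(n)
    rank = np.linalg.matrix_rank(M)
    # Eigenspace dimension = n - rank; each eigenvalue here
    # is non-degenerate, so the drop is exactly 1.
    assert n - rank == 1
```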
 
  • #11
I understand your point. But this would be true of any operator matrix, not only true of diagonal operator matrices, right?
 
  • #12
For diagonal matrices, the standard basis is already the eigenbasis.
 
  • #13
hnicholls said:
I understand that any scalar multiple of the basis eigenvector will also be an eigenvector, but what I am still confused by is that for matrices that are not diagonal we use the characteristic equation to compute the eigenvalues and then find the eigenvectors by calculating a matrix equation which represents a system of two linear equations, which in turn reduces to a single linear equation which is solved (in my example) for a and b.

Does this procedure not work if we have a diagonal matrix?

In other words, do diagonal matrices, where the eigenvectors are basis vectors (of the standard basis) not allow for a computation of their eigenvectors using the technique we would use for a matrix which is not diagonalized?
You were already shown, in the first few responses, how that method does work when the matrix is diagonal! You just used it incorrectly. If you had done the same thing with a non-diagonal matrix you would have gotten the wrong eigenvectors for it.

The non-diagonal matrix
[tex]\begin{bmatrix}-1 & -2 \\ 6 & 6 \end{bmatrix}[/tex]

also has eigenvalues 2 and 3. To find an eigenvector corresponding to eigenvalue 2, row reduce
[tex]\begin{bmatrix}-1-2 & -2 \\ 6 & 6- 2\end{bmatrix}= \begin{bmatrix} -3 & -2 \\ 6 & 4\end{bmatrix}[/tex]

Row reduction of that immediately leads to
[tex]\begin{bmatrix}1 & \frac{2}{3}\\ 0 & 0 \end{bmatrix}\begin{bmatrix}a \\ b\end{bmatrix}= \begin{bmatrix}0 \\ 0\end{bmatrix}[/tex]

If you did the same thing as before, you would now conclude that b= 0 which is incorrect. What is true is that 0b= 0 so that b can be anything. You seem to be under the impression that there is one "correct" eigenvector corresponding to a given eigenvalue. As you have been told here, there are always an infinite number of eigenvectors corresponding to any eigenvalue.
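[An editorial NumPy check of this example; note I take the lower-left entry to be +6, the value consistent with eigenvalues 2 and 3 and with the printed row reduction. From a + (2/3)b = 0, one eigenvector for eigenvalue 2 is (2, -3).]

```python
import numpy as np

# Non-diagonal matrix with eigenvalues 2 and 3.
A = np.array([[-1.0, -2.0],
              [ 6.0,  6.0]])

w = np.linalg.eigvals(A)
assert np.allclose(sorted(w), [2.0, 3.0])

# From the reduced row [1, 2/3]: a = -(2/3) b, so (2, -3) works.
v = np.array([2.0, -3.0])
assert np.allclose(A @ v, 2.0 * v)
```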
 
Last edited by a moderator:
  • #14
Thank you. I think I am finally understanding this.

One last question. For a non-diagonal matrix:


|4 2 |
|3 -1 |

with eigenvalues 5 and -2.

If I use 5 and calculate the eigenvector:

R - λI =

|4-λ 2 |
|3 -1-λ |

R - λI =

|-1 2 |
| 3 -6 |

|-1 2 ||a| = |0|
|3 -6 ||b| = |0|

a = 2b
3a = 6b


a = 2 and b = 1

So eigenvector of eigenvalue 5 is

|2|
|1|

As there are an infinite number of eigenvectors corresponding to any eigenvalue, what would be another eigenvector that corresponds to this eigenvalue? What are the conditions for these infinite number of eigenvectors corresponding to this eigenvalue?
 
  • #15
These:

[tex] \left(\begin{array}{c}2 \lambda \\ \lambda \end{array}\right)[/tex]

for any [itex]\lambda \neq 0[/itex].
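[An editorial NumPy sketch confirming the family of eigenvectors: every nonzero multiple of (2, 1) is an eigenvector of this matrix for eigenvalue 5, while a vector off that line, such as (4, 1), is not.]

```python
import numpy as np

A = np.array([[4.0,  2.0],
              [3.0, -1.0]])

# Any nonzero multiple of (2, 1) is an eigenvector for eigenvalue 5.
for c in (1.0, 2.0, -0.5, 7.0):
    v = c * np.array([2.0, 1.0])
    assert np.allclose(A @ v, 5.0 * v)

# But (4, 1) is not a multiple of (2, 1), so it is not one.
u = np.array([4.0, 1.0])
assert not np.allclose(A @ u, 5.0 * u)
```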
 
  • #16
Great. So using any scalar multiple of the basis eigenvector, we could have for this eigenvalue 5:

|4| or |8|
|2| or |4|, etc.

but not

|4|
|1|

right?
 
Last edited:
  • #17
Right.
 
  • #18
hnicholls said:
Great. So using any scalar multiple of the basis eigenvector, we could have for this eigenvalue 5:

|4| or |8|
|2| or |4|, etc.

but not

|4|
|1|

right?

Yes, with the exception of scalar multiplication by 0, as that would result in the zero vector. Also, it's your choice which pair of eigenvectors to use as a basis.
 
  • #19
In this particular example, yes.
 
