Linear operators, eigenvalues, diagonal matrices


Discussion Overview

The discussion revolves around linear operators, their eigenvalues, and the representation of these operators as matrices with respect to different bases. Participants explore the implications of eigenvalues, diagonal matrices, and theorems related to these concepts.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant questions how to represent a linear operator T as a diagonal matrix with respect to a specific basis and references a theorem about distinct eigenvalues leading to diagonalizability.
  • Another participant challenges the correctness of the matrix representation provided, indicating that the order of basis vectors affects the resulting matrix and eigenvalues.
  • There is a discussion about the relationship between the diagonal entries of a matrix and its eigenvalues, with one participant expressing confusion about the implications of a theorem stating that diagonal entries correspond to eigenvalues.
  • Participants note that while diagonal matrices have eigenvalues along the diagonal, not all matrices have distinct eigenvalues, and the presence of repeated eigenvalues is acknowledged.
  • One participant clarifies that every n by n matrix has n eigenvalues when considering multiplicities and complex numbers, but this does not guarantee n independent eigenvectors.
  • Another participant reflects on the implications of having repeated eigenvalues in the context of a specific linear operator example.

Areas of Agreement / Disagreement

Participants initially disagree about the correctness of a matrix representation, and there is confusion about how the order of basis vectors affects the resulting matrix and about the interpretation of a theorem relating diagonal entries to eigenvalues. Both points are largely resolved by the end of the thread.

Contextual Notes

Participants highlight potential misunderstandings regarding the ordering of basis vectors and its impact on matrix representation and eigenvalues. The discussion also touches upon the distinction between diagonalizability and the existence of eigenvalues.

bjgawp
So I have a couple of questions about linear operators, their eigenvalues, and how these relate to their matrices with respect to some basis.

For example, I want to show that the linear operator T defined by T(x_1,x_2,x_3) = (3x_3, 2x_2, x_1) can be represented by a diagonal matrix with respect to some basis of V = \mathbb{R}^3.

So one approach is to use a theorem that says: If T has dim(V) distinct eigenvalues, then T has a diagonal matrix with respect to some basis of V.

So we simply look for the solutions of: \lambda x_1 = 3 x_3, \lambda x_2 = 2 x_2, \lambda x_3 = x_1

So we find that \lambda = 2, \pm \sqrt 3, with some corresponding eigenvectors. So we use the theorem and finish off the question.
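Those eigenvalues can be double-checked by writing down the matrix of T in the ordinary standard basis and computing its spectrum numerically; a minimal sketch assuming numpy (not part of the original thread):

```python
import numpy as np

# Matrix of T(x1, x2, x3) = (3*x3, 2*x2, x1) in the standard basis e1, e2, e3:
# column j holds the coordinates of T(e_j).
A = np.array([[0.0, 0.0, 3.0],
              [0.0, 2.0, 0.0],
              [1.0, 0.0, 0.0]])

eigenvalues = np.sort(np.linalg.eigvals(A))
print(eigenvalues)  # approximately [-1.732, 1.732, 2], i.e. -sqrt(3), sqrt(3), 2
```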

So finally, my problem with all this. The matrix of T with respect to the standard basis {(0,0,1), (0,1,0),(1,0,0)} (purposely out of order) would be:

\begin{bmatrix} 3 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 1 \end{bmatrix}

which is diagonal, right? But I have a theorem that says that when T has an upper-triangular matrix with respect to some basis, its eigenvalues "consist precisely of the entries on the diagonal of that upper-triangular matrix".

But I didn't find 3 and 1 as eigenvalues. Is it because I switched the standard basis elements out of order? Why would that matter? And doesn't this last theorem imply that every linear operator has precisely dim V eigenvalues?

It was a long one but any comments are appreciated!
 
bjgawp said:
So finally, my problem with all this. The matrix of T with respect to the standard basis {(0,0,1), (0,1,0),(1,0,0)} (purposely out of order) would be:

\begin{bmatrix} 3 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 1 \end{bmatrix}
This is wrong. If we call the basis vectors v_1,v_2,v_3 (in the order you've written them), we have for example

T_{31}=(Tv_1)_3=(3v_3)_3=3(v_3)_3=3\cdot 1=3

The third component of v_3 in the basis you have defined is 1, not 0, because v_3=0\cdot v_1+0\cdot v_2+1\cdot v_3.
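The same conclusion follows from the change-of-basis formula: if the columns of P are the reordered basis vectors, then the matrix of T in that basis is P^{-1}AP, where A is the matrix of T in the usual standard basis. A minimal numpy sketch (not from the thread):

```python
import numpy as np

# Matrix of T in the usual standard basis e1, e2, e3.
A = np.array([[0.0, 0.0, 3.0],
              [0.0, 2.0, 0.0],
              [1.0, 0.0, 0.0]])

# Columns of P are the reordered basis vectors v1=(0,0,1), v2=(0,1,0), v3=(1,0,0).
P = np.array([[0.0, 0.0, 1.0],
              [0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0]])

# Matrix of T with respect to v1, v2, v3.
B = np.linalg.inv(P) @ A @ P
print(B[2, 0])  # 3.0 -- the entry T_{31} computed above
```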
 
bjgawp said:
So I have a couple of questions about linear operators, their eigenvalues, and how these relate to their matrices with respect to some basis.

For example, I want to show that the linear operator T defined by T(x_1,x_2,x_3) = (3x_3, 2x_2, x_1) can be represented by a diagonal matrix with respect to some basis of V = \mathbb{R}^3.

So one approach is to use a theorem that says: If T has dim(V) distinct eigenvalues, then T has a diagonal matrix with respect to some basis of V.

So we simply look for the solutions of: \lambda x_1 = 3 x_3, \lambda x_2 = 2 x_2, \lambda x_3 = x_1

So we find that \lambda = 2, \pm \sqrt 3, with some corresponding eigenvectors. So we use the theorem and finish off the question.

So finally, my problem with all this. The matrix of T with respect to the standard basis {(0,0,1), (0,1,0),(1,0,0)} (purposely out of order) would be:

\begin{bmatrix} 3 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 1 \end{bmatrix}
No, if you use that order in the "domain", you must also use it in the "range".
T(0, 0, 1)= (3, 0, 0)= 0(0, 0, 1)+ 0(0, 1, 0)+ 3(1, 0, 0) so the first column will be
\begin{bmatrix}0 \\ 0 \\ 3\end{bmatrix}
T(1, 0, 0)= (0, 0, 1)= 1(0, 0, 1)+ 0(0, 1, 0)+ 0(1, 0, 0) so the third column will be
\begin{bmatrix} 1 \\ 0 \\ 0\end{bmatrix}

In this basis, the matrix corresponding to T is
\begin{bmatrix}0 & 0 & 1 \\0 & 2 & 0\\ 3 & 0 & 0\end{bmatrix}
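This column-by-column construction can be sketched in code: each column of the matrix is the coordinate vector of T(v_j) with respect to the reordered basis (a numpy sketch, not from the thread):

```python
import numpy as np

def T(x):
    """The operator from the thread: T(x1, x2, x3) = (3*x3, 2*x2, x1)."""
    x1, x2, x3 = x
    return np.array([3.0 * x3, 2.0 * x2, 1.0 * x1])

# Reordered basis v1, v2, v3, and the map from standard coordinates to
# coordinates in that basis (the inverse of the matrix of basis vectors).
basis = [np.array([0.0, 0.0, 1.0]),
         np.array([0.0, 1.0, 0.0]),
         np.array([1.0, 0.0, 0.0])]
to_coords = np.linalg.inv(np.column_stack(basis))

# Column j = coordinates of T(v_j) in the basis v1, v2, v3.
B = np.column_stack([to_coords @ T(v) for v in basis])
print(B)  # [[0, 0, 1], [0, 2, 0], [3, 0, 0]] up to float formatting
```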

 
Ah, what a silly mistake. I forgot to write each vector as a linear combination of the basis elements.

But then I'm still confused about the whole eigenvalue issue. I know I must be misinterpreting the theorem. The elements along the diagonal are "precisely" the eigenvalues of T. Doesn't this mean that all linear operators have exactly dim V eigenvalues? (Obviously not, but I don't see why.)

Thanks a lot!
 
The elements along the main diagonal of a diagonal matrix, or of a "Jordan normal form" for a non-diagonalizable matrix, are the eigenvalues of the matrix, but in general it is much harder to find the eigenvalues of a matrix than just looking at the main diagonal!

It is, however, true that, allowing for multiplicities and allowing complex numbers, any n by n matrix has n eigenvalues, because its characteristic polynomial has degree n and can be factored into n linear factors over the complex numbers.

An n by n matrix may not have n independent eigenvectors.
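Both points can be illustrated with two standard examples (not from the thread; a numpy sketch): a real rotation matrix whose two eigenvalues are the complex pair i, -i, and a 2 by 2 Jordan block whose eigenvalue 1 has multiplicity 2 but only a one-dimensional eigenspace.

```python
import numpy as np

# Rotation by 90 degrees: real entries, but its 2 eigenvalues are complex (i, -i).
R = np.array([[0.0, -1.0],
              [1.0, 0.0]])
print(np.linalg.eigvals(R))  # i and -i (order may vary)

# Jordan block: the characteristic polynomial is (1 - t)^2, so the
# eigenvalue 1 has algebraic multiplicity 2 ...
J = np.array([[1.0, 1.0],
              [0.0, 1.0]])
print(np.linalg.eigvals(J))  # two eigenvalues, both equal to 1

# ... but rank(J - I) = 1, so its eigenspace is only 1-dimensional:
# there is no basis of eigenvectors, and J is not diagonalizable.
print(np.linalg.matrix_rank(J - np.eye(2)))  # 1
```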
 
Ah right. The entries along the diagonal of an upper-triangular matrix need not be distinct.

So then for example: T(x_1, x_2, x_3) = (2x_1, 2x_2, 3x_3)

We only have two distinct eigenvalues here, but with respect to the standard basis we would still have a diagonal matrix, just with a repeated entry along the diagonal. Right?

That clears things up a lot more. Thanks!
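The last example can also be checked numerically (a numpy sketch, not from the thread): the standard-basis matrix is diagonal with a repeated 2, giving only two distinct eigenvalues but still three independent eigenvectors.

```python
import numpy as np

# T(x1, x2, x3) = (2*x1, 2*x2, 3*x3) is diagonal in the standard basis,
# with the eigenvalue 2 repeated along the diagonal.
D = np.diag([2.0, 2.0, 3.0])

w, V = np.linalg.eig(D)
print(np.sort(w))                # [2. 2. 3.] -- only two distinct eigenvalues
print(np.linalg.matrix_rank(V))  # 3 -- but still three independent eigenvectors
```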
 
