Eigenvectors and their inverses

  • Thread starter: Niles
  • Tags: Eigenvectors
Niles
[SOLVED] Eigenvectors and their inverses

Homework Statement


After submitting the first question, I thought of a new one - so there are two questions:

1) I have an n x n matrix A, and it has n (not necessarily distinct) eigenvalues. I can write the matrix A as the product:

S*D*S^(-1),

where D is the diagonal matrix which has the eigenvalues as its entries, S contains the eigenvectors (in the same order as they are written in D), and S^(-1) is the inverse of S.

In some places I see it written as S^(-1)*D*S, and in others as S*D*S^(-1). Is it always the matrix to the left of D that contains the eigenvectors?

2) If I have a matrix A that represents a transformation L from R^4 -> R, given by [1 -1 3 0], then how can I determine from A whether L is linear?

Thanks in advance,

sincerely Niles.
 
Comment on 1)
If A is n x n and it does not have n distinct eigenvalues (as you are saying), then you cannot diagonalize it. This is a theorem.

For instance, if you have a matrix (A B; C D) and it has eigenvalues E, F, then you can diagonalize it (assuming you can find the eigenvectors). The diagonalized matrix would be of the form (E 0; 0 F).

For 2)
Use the definition of a linear transformation: it must preserve vector addition and scalar multiplication. If the results are the same, then it is a linear transformation.
Thanks

Asif
 
I was a bit puzzled by "eigenvectors and their inverses"!

Niles said:

Homework Statement


After submitting the first question, I thought of a new one - so there are two questions:

1) I have an n x n matrix A, and it has n (not necessarily distinct) eigenvalues. I can write the matrix A as the product:

S*D*S^(-1),

where D is the diagonal matrix which has the eigenvalues as its entries, S contains the eigenvectors (in the same order as they are written in D), and S^(-1) is the inverse of S.
S contains the eigenvectors corresponding to the eigenvalues of D, in that order.
Also, those eigenvectors can be chosen so that S is orthonormal.

In some places I see it written as S^(-1)*D*S, and in others as S*D*S^(-1). Is it always the matrix to the left of D that contains the eigenvectors?
Yes, it is the left matrix, S, in SDS^(-1) = A, that contains the eigenvectors of A as columns. S^(-1) then contains eigenvectors of A as rows, though not necessarily the same eigenvectors. Of course, if the eigenvectors are chosen so that S is orthonormal, then S = S^(-1).
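
A quick NumPy sketch of this convention (the matrix A below is an arbitrary example, not one from the thread):

Code:
import numpy as np

# An arbitrary diagonalizable matrix, just for illustration.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix S whose COLUMNS
# are the corresponding eigenvectors, in matching order.
eigvals, S = np.linalg.eig(A)
D = np.diag(eigvals)

# S sits to the LEFT of D in the factorization A = S*D*S^(-1).
print(np.allclose(A, S @ D @ np.linalg.inv(S)))  # True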

2) If I have a matrix A that represents a transformation L from R^4 -> R, given by [1 -1 3 0], then how can I determine from A whether L is linear?
?? Matrix multiplication is always linear and so determines a linear transformation. A transformation is linear if and only if it can be written as a matrix.
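
To see this concretely, here is a minimal NumPy check that x -> Ax satisfies the two linearity properties for the matrix in the question (the test vectors and scalar are arbitrary choices):

Code:
import numpy as np

# The 1x4 matrix from the question; it maps R^4 -> R.
A = np.array([[1.0, -1.0, 3.0, 0.0]])

rng = np.random.default_rng(0)
u, v = rng.standard_normal(4), rng.standard_normal(4)
c = 2.5

# Additivity: L(u + v) = L(u) + L(v)
print(np.allclose(A @ (u + v), A @ u + A @ v))  # True
# Homogeneity: L(c*u) = c*L(u)
print(np.allclose(A @ (c * u), c * (A @ u)))    # True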

Thanks in advance,

sincerely Niles.
 
First, thanks to both of you.

Mr. HallsofIvy:

"Also, those eigenvectors can be chosen so that S is orthonormal." - that is only if A is symmetric, and by using Gram-Schmidt on the eigenvectors belonging to distinct eigenvalues, right? I believe that's what you told me in a previous post.

In S^(-1)*D*S = A, is it the right matrix, S, that contains the eigenvectors as columns?
 
asif zaidi said:
Comment on 1)
If A is n x n and it does not have n distinct eigenvalues (as you are saying), then you cannot diagonalize it. This is a theorem.
No, there is no such theorem. An n x n matrix is diagonalizable if and only if there exists a "complete set" of eigenvectors: n independent eigenvectors. It does not matter if the eigenvalues are distinct, as long as there are enough independent eigenvectors corresponding to each multiple eigenvalue. Then we can choose those eigenvectors as a basis for the space, and the matrix, in that basis, is diagonal. For example, the matrix
\[
\left[\begin{array}{ccc} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{array}\right]
\]
is trivially "diagonalizable" but has only 1 as eigenvalue. Of course, the independent vectors <1, 0, 0>, <0, 1, 0> and <0, 0, 1> are all eigenvectors corresponding to eigenvalue 1.
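
A short NumPy illustration of the distinction (the Jordan block J below is a standard textbook example, not from the thread): the identity has a repeated eigenvalue yet a full set of independent eigenvectors, while J has the same repeated eigenvalue but only one independent eigenvector.

Code:
import numpy as np

# Repeated eigenvalue, but still diagonalizable: the identity matrix.
vals, vecs = np.linalg.eig(np.eye(3))
print(vals)                          # [1. 1. 1.]
print(np.linalg.matrix_rank(vecs))   # 3: a full set of independent eigenvectors

# Repeated eigenvalue, NOT diagonalizable: a 2x2 Jordan block.
J = np.array([[1.0, 1.0],
              [0.0, 1.0]])
vals, vecs = np.linalg.eig(J)
print(vals)                          # [1. 1.]
print(np.linalg.matrix_rank(vecs))   # 1 (numerically): the eigenvectors are dependent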

For instance, if you have a matrix (A B; C D) and it has eigenvalues E, F, then you can diagonalize it (assuming you can find the eigenvectors). The diagonalized matrix would be of the form (E 0; 0 F).
But the question was about whether the matrix S "containing" the eigenvectors (I assume that means "having the eigenvectors as columns") should be on the left or right: is it A = SDS^(-1) or A = S^(-1)DS? And you don't answer that.

For 2)
Use the definition of a linear transformation: it must preserve vector addition and scalar multiplication. If the results are the same, then it is a linear transformation.
And, of course, a matrix multiplication always does satisfy those.



Thanks

Asif
 
HallsofIvy said:
Of course, if the eigenvectors are chosen so that S is orthonormal, then S = S^(-1).
If S is orthonormal then S^(-1) = S^T, not S.
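
This is easy to verify numerically. A sketch using np.linalg.eigh, which returns orthonormal eigenvectors for a symmetric matrix (the matrix below is an arbitrary example):

Code:
import numpy as np

# An arbitrary symmetric matrix: its eigenvectors can be chosen orthonormal.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eigh is for symmetric/Hermitian matrices and returns
# an orthonormal set of eigenvectors as the columns of S.
vals, S = np.linalg.eigh(A)

print(np.allclose(S.T @ S, np.eye(2)))     # True: the columns are orthonormal
print(np.allclose(np.linalg.inv(S), S.T))  # True: S^(-1) = S^T, not S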
 
Niles said:
First, thanks to both of you.

Mr. HallsofIvy:

"Also, those eigenvectors can be chosen so that S is orthonormal." - that is only if A is symmetric, and by using Gram-Schmidt on the eigenvectors belonging to distinct eigenvalues, right? I believe that's what you told me in a previous post.

In S^(-1)*D*S = A, is it the right matrix, S, that contains the eigenvectors as columns?

Yes, A must be symmetric. I am so used to working with symmetric matrices (the easy situation!) that I didn't think. Sorry.

No, I thought I said that it had to be the left matrix. Of course, a more common problem is: given A, find D. In that case, multiplying SDS^(-1) = A on the left by S^(-1) and on the right by S, we get D = S^(-1)AS. Going that way, S is "on the right" and S^(-1) is on the left.
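
The same computation in NumPy, continuing the earlier arbitrary example:

Code:
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigvals, S = np.linalg.eig(A)

# Given A and S, recover the diagonal matrix: D = S^(-1) * A * S.
D = np.linalg.inv(S) @ A @ S
print(np.allclose(D, np.diag(eigvals)))  # True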
 
D H said:
If S is orthonormal then S^(-1) = S^T, not S.
Yes, I meant to say "orthogonal" (or, in the complex case, "unitary").
 