## Adjoint Operator Help

I was looking for a hint on a problem in my professor's notes (class is over and I was just auditing).

I want to show that if $T:V→V$ is a linear operator on a finite-dimensional inner product space, and $T$ is diagonalizable (not necessarily orthogonally diagonalizable), then the adjoint operator of $T$ (with respect to the inner product) is diagonalizable as well.

I think I should show that the eigenspace of $T$ for $λ$ and the eigenspace of the adjoint for $\overline{λ}$ have the same dimension (I know the eigenspaces themselves need not coincide, since that holds only for normal operators), but I'm not sure if this is the right way to go.

Any small push in the right direction would help. Thanks very much.

EDIT: The definition here of diagonalizable is that there exists a basis, $\chi$, such that $[T]_\chi$ is a diagonal matrix (i.e. the matrix representation of $T$ with respect to the basis is a diagonal matrix).
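Since the claim is concrete, here is a quick numerical sanity check (just a sketch with numpy and a made-up example matrix, not part of the proof): a diagonalizable but non-normal $T$ whose adjoint nevertheless has a full basis of eigenvectors, with the conjugate eigenvalues.

```python
import numpy as np

# Build a diagonalizable but non-normal operator T = P D P^{-1},
# using a non-unitary change of basis P (hypothetical example data).
P = np.array([[1.0, 1.0], [0.0, 1.0]])   # invertible, but not unitary
D = np.diag([2.0 + 1.0j, -1.0j])         # distinct eigenvalues
T = P @ D @ np.linalg.inv(P)

# T is not normal: T T* != T* T (its eigenvectors are not orthogonal).
Tstar = T.conj().T
print(np.allclose(T @ Tstar, Tstar @ T))   # False

# Yet T* is still diagonalizable: its eigenvalues are the conjugates,
# and its eigenvector matrix is invertible (a genuine basis).
evals, V = np.linalg.eig(Tstar)
print(np.allclose(np.sort_complex(evals),
                  np.sort_complex(np.conj(np.diag(D)))))   # True
print(abs(np.linalg.det(V)) > 1e-9)                        # True
```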

Mentor
You should post your definition of "diagonalizable".
 I included the definition in the edit I made above.

Mentor


OK, that is one of several definitions that we can work with. (Another one is that there exists an invertible matrix P such that ##P^{-1}TP## is diagonal). When we're using your definition, I think the easiest way is to just write down an explicit formula for the ij component of ##[T]_\chi## (with i and j arbitrary), that involves the basis vectors and the inner product. Can you do that?

Once you've done that, you just use the definition of the adjoint operator, and you're almost done.

In case you're wondering why I don't just tell you the complete answer, it's because the forum rules say that we should treat every textbook-style problem as homework. So I can only give you hints.
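For reference, the formula this hint presumably points at (written here assuming an orthonormal basis $\chi=\{e_1,\dots,e_n\}$ and an inner product linear in the first argument; as the thread notes below, orthonormality turns out to be essential) is

$$([T]_\chi)_{ij} = \langle T e_j, e_i\rangle, \qquad ([T^*]_\chi)_{ij} = \langle T^* e_j, e_i\rangle = \langle e_j, T e_i\rangle = \overline{\langle T e_i, e_j\rangle} = \overline{([T]_\chi)_{ji}},$$

so with respect to an orthonormal basis, $[T^*]_\chi = ([T]_\chi)^*$, the conjugate transpose.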

 Quote by Fredrik OK, that is one of several definitions that we can work with. (Another one is that there exists an invertible matrix P such that ##P^{-1}TP## is diagonal). When we're using your definition, I think the easiest way is to just write down an explicit formula for the ij component of ##[T]_\chi## (with i and j arbitrary), that involves the basis vectors and the inner product. Can you do that? Once you've done that, you just use the definition of the adjoint operator, and you're almost done. In case you're wondering why I don't just tell you the complete answer, it's because the forum rules say that we should treat every textbook-style problem as homework. So I can only give you hints.
Well, if this were an orthonormal basis I would know how to do it, but since it needn't be, I'm not so sure. I'll have a think.

Not to worry, I didn't want an answer, just a hint.

Mentor
 Quote by sammycaps Well if this were an orthonormal basis,...
Ah, you're right. The simple solution I had in mind only works with orthonormal bases. I will have to think about it as well. (However, I think I still see a simple solution based on the other definition of diagonalizable).
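A sketch of the matrix-level route the other definition suggests (this assumes the identification $[T^*] = [T]^*$, which holds when the matrices are taken with respect to an orthonormal basis; such a basis always exists by Gram–Schmidt, and the matrix definition of diagonalizable does not depend on which basis was used): if $A = [T]$ is diagonalizable, write

$$A = P D P^{-1} \quad\Longrightarrow\quad A^* = (P^{-1})^* D^* P^* = (P^*)^{-1}\,\overline{D}\,P^*,$$

so $A^*$ is similar to the diagonal matrix $\overline{D}$ and hence diagonalizable.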

 How are you defining "diagonal" when you say $PTP^{-1}$ is diagonal?

Mentor
 Quote by sammycaps How are you defining "diagonal" when you say $PTP^{-1}$ is diagonal?
Sorry, I'm so used to thinking of square matrices and linear operators on finite-dimensional vector spaces as "the same thing" that it didn't even occur to me that this is an issue. If we denote the vector space by X, there's always an isomorphism ##F:X\to\mathbb R^n##. It seems to make sense to call T "diagonal" if the matrix representation of ##F\circ T## with respect to the standard basis of ##\mathbb R^n## is a diagonal matrix.

Unfortunately, this seems to make the answer to the question "Is T diagonal?" depend on the choice of the isomorphism F. So maybe this was a bad idea.

 How about showing that there exists a basis of eigenvectors for A*? If $\lambda$ is an eigenvalue of A, then $\overline{\lambda}$ is an eigenvalue of A*. So, as in the OP, it suffices to show that the eigenspaces have the same dimension.
 Try to use the relation $$Ker(A^*) = (Im(A))^\bot$$

 Quote by micromass Try to use the relation $$Ker(A^*) = (Im(A))^\bot$$
Well if the operator were normal (i.e. orthogonally diagonalizable) then we could restrict the operator to just the eigenspace and use the dimension theorem to arrive at the answer (I think). But the operator is not normal, so I'm still thinking about how to proceed. Thanks for the hint.
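The hinted relation is easy to check numerically. Below is a small numpy sketch (with made-up example data) that verifies both the orthogonality $Ker(A^*) \perp Im(A)$ and the dimension count $\dim Ker(A^*) + \dim Im(A) = n$ for a rank-deficient complex matrix.

```python
import numpy as np

# Hypothetical example: a 4x4 complex matrix forced to have rank 3,
# so that Ker(A*) is nontrivial. A* denotes the conjugate transpose.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A[:, 3] = A[:, 0] + A[:, 1]             # exact linear dependence => rank 3

Astar = A.conj().T

# Ker(A*): right-singular vectors of A* for (near-)zero singular values.
_, s, Vh = np.linalg.svd(Astar)
rank = int(np.sum(s > 1e-10))
ker_Astar = Vh[rank:].conj().T          # columns span Ker(A*)

# Im(A): left-singular vectors of A for nonzero singular values.
Ua, sa, _ = np.linalg.svd(A)
im_A = Ua[:, :int(np.sum(sa > 1e-10))]  # columns span Im(A)

# Every vector in Ker(A*) is orthogonal to Im(A) ...
print(np.allclose(im_A.conj().T @ ker_Astar, 0))   # True
# ... and the dimensions are complementary: 1 + 3 = 4.
print(ker_Astar.shape[1] + im_A.shape[1])          # 4
```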
