I was looking for a hint on a problem in my professor's notes (class is over and I was just auditing).

I want to show that if $T:V→V$ is a linear operator on a finite-dimensional inner product space and $T$ is diagonalizable (not necessarily orthogonally diagonalizable), then the adjoint operator of $T$ (with respect to the inner product) is diagonalizable as well.

I think I should show that the eigenspace of $λ$ for $T$ and the eigenspace of $\overline{λ}$ for $T^*$ have the same dimension (I know they are not the same subspace, since that holds only for normal operators), but I'm not sure if this is the right way to go.

Any small push in the right direction would help. Thanks very much.

EDIT: The definition of diagonalizable here is that there exists a basis $\chi$ such that $[T]_\chi$ is a diagonal matrix (i.e. the matrix representation of $T$ with respect to the basis is a diagonal matrix).
Mentor

You should post your definition of "diagonalizable".
 I included the definition in the edit I made above.

Mentor

OK, that is one of several definitions that we can work with. (Another one is that there exists an invertible matrix P such that ##P^{-1}TP## is diagonal). When we're using your definition, I think the easiest way is to just write down an explicit formula for the ij component of ##[T]_\chi## (with i and j arbitrary), that involves the basis vectors and the inner product. Can you do that?
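To make this hint concrete (just a sketch, and it assumes the basis $\chi = \{e_1,\dots,e_n\}$ is orthonormal and the inner product is linear in the second argument), the formula would be
$$([T]_\chi)_{ij} = \langle e_i, Te_j\rangle,$$
and applying the definition of the adjoint, $\langle T^*x, y\rangle = \langle x, Ty\rangle$, gives
$$([T^*]_\chi)_{ij} = \langle e_i, T^*e_j\rangle = \overline{\langle T^*e_j, e_i\rangle} = \overline{\langle e_j, Te_i\rangle} = \overline{([T]_\chi)_{ji}},$$
so $[T^*]_\chi$ is the conjugate transpose of $[T]_\chi$, which is diagonal whenever $[T]_\chi$ is.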

Once you've done that, you just use the definition of the adjoint operator, and you're almost done.

In case you're wondering why I don't just tell you the complete answer, it's because the forum rules say that we should treat every textbook-style problem as homework. So I can only give you hints.

 Quote by Fredrik OK, that is one of several definitions that we can work with. (Another one is that there exists an invertible matrix P such that ##P^{-1}TP## is diagonal). When we're using your definition, I think the easiest way is to just write down an explicit formula for the ij component of ##[T]_\chi## (with i and j arbitrary), that involves the basis vectors and the inner product. Can you do that? Once you've done that, you just use the definition of the adjoint operator, and you're almost done. In case you're wondering why I don't just tell you the complete answer, it's because the forum rules say that we should treat every textbook-style problem as homework. So I can only give you hints.
Well if this were an orthonormal basis, I would know how to do it, but since it isn't then I'm not so sure. I'll have a think.

Not to worry, I didn't want an answer, just a hint.

Mentor
 Quote by sammycaps Well if this were an orthonormal basis,...
Ah, you're right. The simple solution I had in mind only works with orthonormal bases. I will have to think about it as well. (However, I think I still see a simple solution based on the other definition of diagonalizable).
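For what it's worth, a sketch of that similarity-based solution (assuming, as is standard, that the matrix of $T^*$ with respect to an orthonormal basis is the conjugate transpose of the matrix of $T$): if $A = [T]_\beta$ for an orthonormal basis $\beta$ and $A$ is diagonalizable, write
$$A = PDP^{-1}$$
with $D$ diagonal and $P$ invertible. Taking conjugate transposes,
$$A^* = (P^{-1})^* D^* P^* = QD^*Q^{-1}, \qquad Q := (P^*)^{-1},$$
and $D^* = \overline{D}$ is again diagonal, so $A^* = [T^*]_\beta$ is diagonalizable, hence so is $T^*$.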
 How are you defining "diagonal" when you say ##P^{-1}TP## is diagonal?

Mentor
 Quote by sammycaps How are you defining "diagonal" when you say ##P^{-1}TP## is diagonal?
Sorry, I'm so used to thinking of square matrices and linear operators on finite-dimensional vector spaces as "the same thing" that it didn't even occur to me that this is an issue. If we denote the vector space by X, there's always an isomorphism ##F:X\to\mathbb R^n##. It seems to make sense to call T "diagonal" if the matrix representation of ##F\circ T## with respect to the standard basis of ##\mathbb R^n## is a diagonal matrix.

Unfortunately, this seems to make the answer to the question "Is T diagonal?" depend on the choice of the isomorphism F. So maybe this was a bad idea.
Mentor
Blog Entries: 8

How about showing that there exists a basis of eigenvectors for A*? If $\lambda$ is an eigenvalue of A, then $\overline{\lambda}$ is an eigenvalue of A*. So as in the OP, it suffices to show that the eigenspaces have the same dimension.
Mentor
Blog Entries: 8

Try to use the relation $$Ker(A^*) = (Im(A))^\bot$$
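Spelling out one way this hint can be used (a sketch, writing $n = \dim V$): since
$$A^* - \overline{\lambda}I = (A - \lambda I)^*,$$
applying the relation with $B = A - \lambda I$ gives
$$\dim Ker(A^* - \overline{\lambda}I) = \dim\big(Im(A - \lambda I)\big)^\bot = n - rank(A - \lambda I) = \dim Ker(A - \lambda I),$$
where the last step is rank-nullity. So the eigenspace of $A$ for $\lambda$ and the eigenspace of $A^*$ for $\overline{\lambda}$ have the same dimension.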

 Quote by micromass Try to use the relation $$Ker(A^*) = (Im(A))^\bot$$
Well if the operator were normal (i.e. orthogonally diagonalizable) then we could restrict the operator to just the eigenspace and use the dimension theorem to arrive at the answer (I think). But the operator is not normal, so I'm still thinking about how to proceed. Thanks for the hint.

Mentor
Blog Entries: 8
 Quote by sammycaps Well if the operator were normal (i.e. orthogonally diagonalizable) then we could restrict the operator to just the eigenspace and use the dimension theorem to arrive at the answer (I think). But the operator is not normal, so I'm still thinking about how to proceed. Thanks for the hint.
What do you mean with the dimension theorem? Why can't we use it now?

 Quote by micromass What do you mean with the dimension theorem? Why can't we use it now?
Actually, I'm not sure. I thought there was something we needed from normal operators, but now I don't think so (well, for a normal operator the eigenspaces of corresponding eigenvalues are the same, but that is stronger than I need anyway).
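To tie this together (again just a sketch): if $A$ is diagonalizable with distinct eigenvalues $\lambda_1,\dots,\lambda_k$, then
$$\sum_{i=1}^k \dim Ker(A - \lambda_i I) = n.$$
By the equal-dimension argument,
$$\sum_{i=1}^k \dim Ker(A^* - \overline{\lambda_i}I) = n,$$
and since eigenvectors for distinct eigenvalues are linearly independent, the eigenvectors of $A^*$ span $V$, i.e. $A^*$ is diagonalizable.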
