Undergrad: Orthogonality of Eigenvectors of a Linear Operator and Its Adjoint

Summary: In a finite-dimensional complex vector space ##V## with a Hermitian inner product, the discussion focuses on proving that a linear operator ##T## is diagonalizable if and only if for each eigenvector ##v## of ##T## there is an eigenvector ##u## of its adjoint ##T^*## with ##\langle u, v \rangle \neq 0##. The straightforward direction assumes ##T## is diagonalizable, which gives a basis of ##V## consisting of eigenvectors of ##T##. The challenge lies in the converse, which is considered plausible but not immediately clear. The conversation turns on what the eigenbasis of ##T## yields in the dual space ##V^*##, and more broadly on the relationship between the eigenvectors of ##T## and those of ##T^*##.
ughpleasenope
Suppose we have ##V##, a finite-dimensional complex vector space with a Hermitian inner product. Let ##T: V \to V## be an arbitrary linear operator, and let ##T^*## be its adjoint.

I wish to prove that ##T## is diagonalizable iff for every eigenvector ##v## of ##T##, there is an eigenvector ##u## of ##T^*## such that ##\langle u, v \rangle \neq 0##.

I've been thinking about generalized eigenvectors, but have not really gotten anywhere.
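Not a proof, but a concrete sanity check of the forward direction in numpy (a sketch of my own, not from the thread; the matrix size and random seed are arbitrary choices): for a diagonalizable ##A = PDP^{-1}##, the columns of ##(P^{-1})^*## are eigenvectors of ##A^*##, biorthogonal to the columns of ##P##.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
# A random complex matrix is diagonalizable with probability 1
# (its eigenvalues are distinct almost surely).
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

evals, P = np.linalg.eig(A)        # columns of P: eigenvectors of A
U = np.linalg.inv(P).conj().T      # columns of U: candidate eigenvectors of A*

# Each u_k is an eigenvector of the adjoint A* with eigenvalue conj(lambda_k):
# from A = P D P^{-1} we get A* = (P^{-1})* D* P*, so A* U = U conj(D).
Astar = A.conj().T
for k in range(n):
    assert np.allclose(Astar @ U[:, k], np.conj(evals[k]) * U[:, k])

# Biorthogonality: with <u, v> = u* v, the Gram matrix of <u_i, v_j> is
# U* P = P^{-1} P = I, so every eigenvector v_i of A pairs nontrivially
# with the eigenvector u_i of A*.
assert np.allclose(U.conj().T @ P, np.eye(n))
print("forward direction verified numerically for this A")
```

Numerics aside, the useful content is the biorthogonality ##\langle u_i, v_j \rangle = \delta_{ij}##, which is exactly the dual-basis observation made below.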
 
The direction where you assume ##T## is diagonalizable is pretty straightforward, I think?

The other direction is not immediately obvious to me, but it sounds plausible; I'll sleep on it.
 
Office_Shredder said:
The direction where you assume ##T## is diagonalizable is pretty straightforward, I think?

The other direction is not immediately obvious to me, but it sounds plausible; I'll sleep on it.
Would you mind elaborating? I've struggled with this for a while.
 
If ##T## is diagonalizable, then you can write down a basis of ##V## consisting entirely of eigenvectors of ##T##.

What kind of basis of ##V^*## do you get from this? (I guess if your class is very matrix-based, this question might not make sense.)
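Spelling out where that hint leads (a sketch of my own, using the convention that ##\langle \cdot, \cdot \rangle## is conjugate-linear in the first slot): suppose ##T## is diagonalizable with eigenbasis ##v_1, \dots, v_n## and eigenvalues ##\lambda_1, \dots, \lambda_n##. The dual basis of ##V^*## corresponds under the Riesz map to vectors ##u_1, \dots, u_n \in V## with ##\langle u_i, v_j \rangle = \delta_{ij}##. Then for all ##i, j##,
$$\langle T^* u_i, v_j \rangle = \langle u_i, T v_j \rangle = \lambda_j \langle u_i, v_j \rangle = \lambda_j \delta_{ij} = \langle \overline{\lambda_i}\, u_i, v_j \rangle,$$
and since the ##v_j## form a basis and the inner product is nondegenerate, ##T^* u_i = \overline{\lambda_i}\, u_i##. Each ##u_i## is therefore an eigenvector of ##T^*## with ##\langle u_i, v_i \rangle = 1 \neq 0##; an arbitrary eigenvector ##v = \sum_i c_i v_i## of ##T## then satisfies ##\langle u_i, v \rangle = c_i \neq 0## for any ##i## with ##c_i \neq 0##.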
 