Orthogonality of Eigenvectors of Linear Operator and its Adjoint

In summary, the thread discusses proving that a linear operator T on a finite-dimensional complex vector space V with a Hermitian inner product is diagonalizable if and only if, for every eigenvector of T, there exists an eigenvector of its adjoint T^* having non-zero inner product with it. The direction where T is assumed diagonalizable is straightforward, while the converse requires more work. Generalized eigenvectors are mentioned but not fully explored.
  • #1
ughpleasenope
Suppose we have V, a finite-dimensional complex vector space with a Hermitian inner product. Let T: V → V be an arbitrary linear operator, and let T^* be its adjoint.

I wish to prove that T is diagonalizable iff for every eigenvector v of T, there is an eigenvector u of T^* such that <u, v> is not equal to 0.

I've been thinking about generalized eigenvectors, but have not really gotten anywhere.
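A numerical sanity check of the claim (not a proof) may help build intuition. The matrix below is a hypothetical example: upper triangular with distinct eigenvalues, hence diagonalizable, and the adjoint with respect to the standard Hermitian inner product is the conjugate transpose.

```python
# For a diagonalizable T, every eigenvector v of T should pair
# non-orthogonally with some eigenvector u of the adjoint T^*.
import numpy as np

T = np.array([[2.0, 1.0],
              [0.0, 3.0]])        # distinct eigenvalues 2 and 3

_, V = np.linalg.eig(T)           # columns are eigenvectors of T
_, U = np.linalg.eig(T.conj().T)  # columns are eigenvectors of T^*

for i in range(V.shape[1]):
    v = V[:, i]
    # <u, v> = u^H v with the standard Hermitian inner product
    overlaps = [abs(U[:, j].conj() @ v) for j in range(U.shape[1])]
    assert max(overlaps) > 1e-10  # some eigenvector of T^* sees v
```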
  • #2
The direction where you assume T is diagonalizable is pretty straightforward I think?

The other direction is not immediately obviously true to me but sounds plausible, I'll sleep on it.
  • #3
Office_Shredder said:
The direction where you assume T is diagonalizable is pretty straightforward I think?

The other direction is not immediately obviously true to me but sounds plausible, I'll sleep on it.
Would you mind elaborating? I've struggled with this for a while.
 
  • #4
If T is diagonalizable, then you can write down a basis of V which are all eigenvectors of T.

What kind of basis of ##V^*## do you get from this? (I guess if your class is very matrix based this question might not make sense)
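One way to read the hint in #2: the eigenvectors of ##T^*## behave like a dual basis to an eigenbasis of ##T##. A sketch, assuming a random complex matrix (whose eigenvalues are distinct with probability 1): an eigenvector of ##T^*## with eigenvalue ##\mu## is orthogonal to every eigenvector of ##T## whose eigenvalue is not ##\overline{\mu}##, which forces a non-zero pairing with the remaining one.

```python
# Biorthogonality of the eigenbases of A and A^*:
# <u_j, v_i> = 0 whenever mu_j != conj(lam_i).
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

lam, V = np.linalg.eig(A)          # A v_i  = lam_i v_i
mu, U = np.linalg.eig(A.conj().T)  # A^* u_j = mu_j u_j

G = U.conj().T @ V                 # G[j, i] = <u_j, v_i>
for j in range(4):
    for i in range(4):
        if abs(mu[j] - lam[i].conjugate()) > 1e-8:
            assert abs(G[j, i]) < 1e-8   # mismatched pairs are orthogonal
```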
 

1. What is the definition of orthogonality in the context of eigenvectors?

Orthogonality means that the inner product of two vectors is zero. In this context, it refers to the fact that an eigenvector of a linear operator and an eigenvector of its adjoint are orthogonal whenever their eigenvalues are not complex conjugates of one another.

2. Why is the orthogonality of eigenvectors important in linear algebra?

The orthogonality (more precisely, biorthogonality) between the eigenvectors of an operator and those of its adjoint lets us expand vectors in an eigenbasis and read off the coefficients with inner products. This simplifies computations and clarifies how a linear operator and its adjoint are related.

3. How can we prove the orthogonality of eigenvectors of a linear operator and its adjoint?

To prove orthogonality, use the defining property of the adjoint, <u, Tv> = <T^*u, v>. If Tv = λv and T^*u = μu, evaluating both sides and comparing forces the inner product <u, v> to vanish whenever λ is not the complex conjugate of μ. Note that on a complex space this is the Hermitian inner product, not the real dot product.
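Written out, with the convention that the inner product is conjugate-linear in the first argument, and assuming Tv = λv and T^*u = μu:

```latex
\langle u, Tv \rangle = \lambda \langle u, v \rangle
\quad\text{and}\quad
\langle u, Tv \rangle = \langle T^{*}u, v \rangle = \overline{\mu}\,\langle u, v \rangle,
```

so (λ − μ̄)<u, v> = 0, and therefore <u, v> = 0 whenever λ ≠ μ̄.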

4. What is the significance of the eigenvalues in the orthogonality of eigenvectors?

The eigenvalues control orthogonality: an eigenvector of T with eigenvalue λ and an eigenvector of T^* with eigenvalue μ are automatically orthogonal whenever λ is not the complex conjugate of μ. When eigenvalues are repeated and the operator is defective (not diagonalizable), an eigenvector of T can even be orthogonal to every eigenvector of T^*.
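The defective case connects back to the question in the thread: for a non-diagonalizable matrix, an eigenvector of T can be orthogonal to every eigenvector of T^*. A minimal sketch with a 2×2 Jordan block:

```python
# A 2x2 Jordan block is not diagonalizable: its only eigenvector
# (up to scale) is e1, while the only eigenvector of its adjoint
# is e2, and <e2, e1> = 0 -- matching the "only if" direction.
import numpy as np

lam = 2.0
J = np.array([[lam, 1.0],
              [0.0, lam]])

v = np.array([1.0, 0.0])  # J v = lam * v
u = np.array([0.0, 1.0])  # J^H u = lam * u

assert np.allclose(J @ v, lam * v)
assert np.allclose(J.conj().T @ u, lam * u)
assert abs(u.conj() @ v) == 0.0   # the pair is orthogonal
```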

5. Can the orthogonality of eigenvectors be extended to non-linear operators?

Not directly. Eigenvectors and adjoints are defined for linear operators, so the orthogonality statements above do not apply to non-linear maps as stated. Related notions exist for certain nonlinear eigenvalue problems, but they are beyond the scope of this question.
