Proving Diagonalizability of Adjoint Operator on Finite Inner Product Space

  • Thread starter sammycaps
  • #1
sammycaps
I was looking for a hint on a problem in my professor's notes (class is over and I was just auditing).

I want to show that if [itex]T:V→V[/itex] is a linear operator on a finite-dimensional inner product space and [itex]T[/itex] is diagonalizable (not necessarily orthogonally diagonalizable), then so is the adjoint operator of [itex]T[/itex] (with respect to the inner product).

I think I should show that the eigenspace of [itex]T[/itex] for λ and the eigenspace of the adjoint for [itex]\overline{λ}[/itex] have the same dimension (I know they are not the same subspace, since that holds only for normal operators), but I'm not sure if this is the right way to go.

Any small push in the right direction would help. Thanks very much.

EDIT: The definition here of diagonalizable is that there exists a basis, [itex]\chi[/itex], such that [itex][T]_\chi[/itex] is a diagonal matrix (i.e. the matrix representation of [itex]T[/itex] with respect to the basis [itex]\chi[/itex] is a diagonal matrix).
 
Last edited:
  • #2
You should post your definition of "diagonalizable".
 
  • #3
I included the definition in the edit I made above.
 
  • #4
OK, that is one of several definitions that we can work with. (Another one is that there exists an invertible matrix P such that ##P^{-1}TP## is diagonal). When we're using your definition, I think the easiest way is to just write down an explicit formula for the ij component of ##[T]_\chi## (with i and j arbitrary), that involves the basis vectors and the inner product. Can you do that?

Once you've done that, you just use the definition of the adjoint operator, and you're almost done.

In case you're wondering why I don't just tell you the complete answer, it's because the forum rules say that we should treat every textbook-style problem as homework. So I can only give you hints.
 
  • #5
Fredrik said:
OK, that is one of several definitions that we can work with. (Another one is that there exists an invertible matrix P such that ##P^{-1}TP## is diagonal). When we're using your definition, I think the easiest way is to just write down an explicit formula for the ij component of ##[T]_\chi## (with i and j arbitrary), that involves the basis vectors and the inner product. Can you do that?

Once you've done that, you just use the definition of the adjoint operator, and you're almost done.

In case you're wondering why I don't just tell you the complete answer, it's because the forum rules say that we should treat every textbook-style problem as homework. So I can only give you hints.

Well if this were an orthonormal basis, I would know how to do it, but since it isn't then I'm not so sure. I'll have a think.

Not to worry, I didn't want an answer, just a hint.
 
Last edited:
  • #6
sammycaps said:
Well if this were an orthonormal basis,...
Ah, you're right. The simple solution I had in mind only works with orthonormal bases. I will have to think about it as well. (However, I think I still see a simple solution based on the other definition of diagonalizable).
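[Editor's note: the matrix-based definition mentioned here is easy to sanity-check numerically. The sketch below is an illustration added for this writeup, not from the thread, and the names P, D, Q are my own: it builds a diagonalizable (generally non-normal) complex matrix A = P D P⁻¹ and verifies that its conjugate transpose is diagonalized by Q = (P*)⁻¹ with the conjugated eigenvalues, since A* = (P⁻¹)* D* P* = Q D̄ Q⁻¹.]

```python
import numpy as np

# Build a diagonalizable, generally non-normal matrix A = P D P^{-1}.
rng = np.random.default_rng(0)
P = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
D = np.diag([1.0 + 2.0j, -0.5j, 3.0])
A = P @ D @ np.linalg.inv(P)

# The adjoint with respect to the standard inner product is the
# conjugate transpose.  Taking adjoints of A = P D P^{-1} gives
# A^* = (P^{-1})^* D^* P^* = Q \bar{D} Q^{-1} with Q = (P^*)^{-1},
# so A^* is diagonalizable with the conjugated eigenvalues.
A_star = A.conj().T
Q = np.linalg.inv(P.conj().T)
assert np.allclose(np.linalg.inv(Q) @ A_star @ Q, D.conj())
```

This only checks the matrix formulation; translating it back to the operator statement still requires relating [T*] to [T] in a possibly non-orthonormal basis, which is exactly the snag discussed above.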
 
  • #7
How are you defining "diagonal" when you say [itex]PTP^{-1}[/itex] is diagonal?
 
  • #8
sammycaps said:
How are you defining "diagonal" when you say [itex]PTP^{-1}[/itex] is diagonal?
Sorry, I'm so used to thinking of square matrices and linear operators on finite-dimensional vector spaces as "the same thing" that it didn't even occur to me that this is an issue. If we denote the vector space by X, there's always an isomorphism ##F:X\to\mathbb R^n##. It seems to make sense to call T "diagonal" if the matrix representation of ##F\circ T## with respect to the standard basis of ##\mathbb R^n## is a diagonal matrix.

Unfortunately, this seems to make the answer to the question "Is T diagonal?" depend on the choice of the isomorphism F. So maybe this was a bad idea.
 
Last edited:
  • #9
How about showing that there exists a basis of eigenvectors for A*?

If [itex]\lambda[/itex] is an eigenvalue of A, then [itex]\overline{\lambda}[/itex] is an eigenvalue of A*. So, as in the OP, it suffices to show that the corresponding eigenspaces have the same dimension.
 
  • #10
Try to use the relation

[tex]Ker(A^*) = (Im(A))^\bot[/tex]
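[Editor's note: this identity can be sanity-checked numerically. The sketch below is an added illustration with hypothetical names, not part of the thread; it uses the SVD, whose left singular vectors with nonzero singular value give an orthonormal basis of im(A), while the remaining ones give one of ker(A*).]

```python
import numpy as np

# Make a 4x4 complex matrix of rank at most 2, so ker(A^*) is nontrivial.
rng = np.random.default_rng(1)
B = rng.standard_normal((4, 2)) + 1j * rng.standard_normal((4, 2))
C = rng.standard_normal((2, 4)) + 1j * rng.standard_normal((2, 4))
A = B @ C

U, s, Vh = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))        # numerical rank
im_A = U[:, :r]                   # orthonormal basis of im(A)
ker_Astar = U[:, r:]              # orthonormal basis of ker(A^*)

assert np.allclose(A.conj().T @ ker_Astar, 0)      # these vectors are killed by A^*
assert np.allclose(im_A @ (im_A.conj().T @ A), A)  # im_A spans the columns of A
assert np.allclose(im_A.conj().T @ ker_Astar, 0)   # the two spaces are orthogonal
```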
 
  • #11
micromass said:
Try to use the relation

[tex]Ker(A^*) = (Im(A))^\bot[/tex]

Well if the operator were normal (i.e. orthogonally diagonalizable) then we could restrict the operator to just the eigenspace and use the dimension theorem to arrive at the answer (I think). But the operator is not normal, so I'm still thinking about how to proceed. Thanks for the hint.
 
  • #12
sammycaps said:
Well if the operator were normal (i.e. orthogonally diagonalizable) then we could restrict the operator to just the eigenspace and use the dimension theorem to arrive at the answer (I think). But the operator is not normal, so I'm still thinking about how to proceed. Thanks for the hint.

What do you mean with the dimension theorem? Why can't we use it now?
 
  • #13
micromass said:
What do you mean with the dimension theorem? Why can't we use it now?

Actually, I'm not sure. I thought there was something we needed from normal operators, but now I don't think so (for a normal operator the eigenspaces for corresponding eigenvalues are the same subspace, but that is stronger than what I need anyway).
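[Editor's note: for completeness, here is how the rank–nullity ("dimension theorem") count hinted at above might go; this is my reading of the hints, not a worked solution from the thread. Writing [itex]E_\lambda(A)[/itex] for the eigenspace of A at λ, and using the fact that a matrix and its conjugate transpose have the same rank:

[tex]\dim E_{\overline{\lambda}}(A^*) = \dim\ker(A^* - \overline{\lambda} I) = \dim\ker\big((A - \lambda I)^*\big) = n - \operatorname{rank}\big((A - \lambda I)^*\big) = n - \operatorname{rank}(A - \lambda I) = \dim E_{\lambda}(A)[/tex]

Since A is diagonalizable, its eigenspace dimensions sum to n; by the count above the eigenspace dimensions of A* sum to n as well, so A* has a basis of eigenvectors.]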
 
Last edited:

What is an adjoint operator?

On an inner product space, the adjoint of a linear operator T is the unique operator T* satisfying ⟨Tv, w⟩ = ⟨v, T*w⟩ for all vectors v and w. With respect to an orthonormal basis, the matrix of T* is the conjugate transpose of the matrix of T. (In functional analysis there is also a related notion of an adjoint acting on the dual space of the original operator's domain.)

How is an adjoint operator related to an adjoint matrix?

An adjoint operator and an adjoint matrix are related in name, but they are not the same thing. The classical "adjoint matrix" (better called the adjugate) is the transpose of the cofactor matrix of a square matrix; it appears in the formula for the inverse of a matrix and is unrelated to the adjoint operator. In modern usage, however, the "adjoint" of a complex matrix usually means its conjugate transpose, and that is the matrix which represents the adjoint operator with respect to an orthonormal basis.

What are some properties of an adjoint operator?

Some properties of the adjoint: it respects sums, (S + T)* = S* + T*, and is conjugate-linear in scalars, (cT)* = c̄T*; it reverses products, (ST)* = T*S*; it is an involution, (T*)* = T; and it preserves the operator norm, ‖T*‖ = ‖T‖. An operator satisfying T* = T is called self-adjoint; self-adjointness is a property of particular operators, not of adjoints in general.

How is an adjoint operator used in quantum mechanics?

In quantum mechanics, the adjoint is used to single out Hermitian (self-adjoint) operators, which represent physical observables. These operators equal their own adjoints and have real eigenvalues, which is what makes them suitable for describing measurable quantities.

What are some applications of adjoint operators?

Adjoint operators have many applications in mathematics and physics. In addition to their use in functional analysis and quantum mechanics, they are also used in signal processing, differential equations, and optimization problems. Adjoint operators allow for the efficient calculation and analysis of linear operators, making them a valuable tool in many fields of science and engineering.
