Linear Algebra - Self-adjoint Operators


Homework Help Overview

The discussion revolves around the properties of self-adjoint operators in the context of linear algebra, specifically focusing on the operator T defined on the inner-product space P2(R) with a specified inner product. The original poster is tasked with demonstrating that T is not self-adjoint and reconciling this with the observation that its matrix representation equals its conjugate transpose.

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning

Approaches and Questions Raised

  • The original poster attempts to find the adjoint operator T* by analyzing the inner product relationships. Some participants question the correctness of the inner product calculation and explore the implications of the matrix representation of T being equal to its conjugate transpose despite T not being self-adjoint.

Discussion Status

Participants have confirmed the original poster's reasoning regarding T not being self-adjoint based on inner product evaluations. There is ongoing exploration of why the matrix representation does not lead to a contradiction, with some guidance provided regarding the dependence on the choice of basis.

Contextual Notes

There are indications of confusion regarding the definition of the inner product and the implications of matrix properties in relation to self-adjointness. The discussion reflects a need for clarity on these concepts as they relate to the specific operator and its matrix representation.

steelphantom

Homework Statement


Make P2(R) into an inner-product space by defining <p, q> = [tex]\int_0^1 p(x)q(x)\,dx[/tex]. Define T in L(P2(R)) by T(a_0 + a_1*x + a_2*x^2) = a_1*x.

(a) Show that T is not self-adjoint.
(b) The matrix of T with respect to the basis (1, x, x2) is

[tex]\left(\begin{array}{ccc} 0 &amp; 0 &amp; 0\\ 0 &amp; 1 &amp; 0\\ 0 &amp; 0 &amp; 0 \end{array}\right)[/tex]

This matrix equals its conjugate transpose, even though T is not self-adjoint. Explain why this is not a contradiction.

Homework Equations



The Attempt at a Solution


To show that T is not self-adjoint, I need to show that T != T*. This amounts to explicitly finding what T* is. I know that <Tv, w> = <v, T*w>.

So we have <T(a_0 + a_1*x + a_2*x^2), b_0 + b_1*x + b_2*x^2> = <a_1*x, b_0 + b_1*x + b_2*x^2> = a_1*x*(b_0 + b_1*x + b_2*x^2). Now I can't quite figure out where to go from here. I know this has to equal <a_0 + a_1*x + a_2*x^2, T*(b_0 + b_1*x + b_2*x^2)>. How can I find T*?

For part (b), I know that if an operator is self-adjoint, its matrix equals its conjugate transpose, but not necessarily the other way around. This is a crappy way of putting it, though. Maybe I'll have a better explanation once I find T*. Thanks for any help!
 
I wasn't paying attention to the problem and was calculating the inner product wrong. It's obviously defined to be that integral from 0 to 1, not the dot product as I was doing before.
 
Can I just do this:

An operator is self-adjoint iff <Tv, w> = <v, Tw> for all v, w. So using the defined inner product, we have <T(a_0 + a_1*x + a_2*x^2), b_0 + b_1*x + b_2*x^2> = (1/2)*a_1*b_0 + (1/3)*a_1*b_1 + (1/4)*a_1*b_2.

But we also have <a_0 + a_1*x + a_2*x^2, T(b_0 + b_1*x + b_2*x^2)> = (1/2)*a_0*b_1 + (1/3)*a_1*b_1 + (1/4)*a_2*b_1.

Clearly <Tv, w> != <v, Tw>, so T is not self-adjoint. Is this correct?
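These two expansions are easy to double-check symbolically. A minimal sketch in Python with sympy (the helper names T and ip are just for this check, not from the thread):

```python
import sympy as sp

x = sp.symbols('x')
a0, a1, a2, b0, b1, b2 = sp.symbols('a0 a1 a2 b0 b1 b2')

v = a0 + a1*x + a2*x**2
w = b0 + b1*x + b2*x**2

def T(p):
    # T(a0 + a1*x + a2*x^2) = a1*x: keep only the degree-1 term
    return p.coeff(x, 1) * x

def ip(p, q):
    # <p, q> = integral from 0 to 1 of p(x)*q(x) dx
    return sp.integrate(p*q, (x, 0, 1))

lhs = sp.expand(ip(T(v), w))  # (1/2)*a1*b0 + (1/3)*a1*b1 + (1/4)*a1*b2
rhs = sp.expand(ip(v, T(w)))  # (1/2)*a0*b1 + (1/3)*a1*b1 + (1/4)*a2*b1
print(sp.simplify(lhs - rhs) == 0)  # False: <Tv, w> != <v, Tw>, so T != T*
```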
 
That's correct. You could even pick an easier example: <1, T(x)> = <1, x> != 0 and <T(1), x> = <0, x> = 0. Not self-adjoint.
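This smaller counterexample takes one line each to verify with the same integral inner product (a sketch, again using sympy):

```python
from sympy import symbols, integrate, Rational

x = symbols('x')
ip = lambda p, q: integrate(p*q, (x, 0, 1))  # <p, q> on [0, 1]

# T(x) = x (its x-coefficient is 1), while T(1) = 0
print(ip(1, x))  # <1, T(x)> = <1, x> = 1/2, nonzero
print(ip(0, x))  # <T(1), x> = <0, x> = 0
```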
 
Dick said:
That's correct. You could even pick an easier example: <1, T(x)> = <1, x> != 0 and <T(1), x> = <0, x> = 0. Not self-adjoint.

Cool. Thanks for the confirmation. Now I'm trying to figure out why I can say that even though my matrix equals its conjugate transpose, it is not a contradiction. I'm thinking it has to do with how sparse the matrix is, specifically on the diagonal, but I'm not sure. Anyone? Thanks!
 
steelphantom said:
I know that if an operator is self-adjoint, its matrix equals its conjugate transpose,
Why do you think that?
 
steelphantom said:
Cool. Thanks for the confirmation. Now I'm trying to figure out why I can say that even though my matrix equals its conjugate transpose, it is not a contradiction. I'm thinking it has to do with how sparse the matrix is, specifically on the diagonal, but I'm not sure. Anyone? Thanks!

It has to do with which basis you use to express the matrix. The equivalence "T is self-adjoint iff its matrix equals its conjugate transpose" only holds when the matrix is taken with respect to an orthonormal basis, and (1, x, x^2) is not orthonormal for this inner product. This is the 'basis' of Hurkyl's skepticism.
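One way to make this concrete: in a possibly non-orthonormal basis with Gram matrix G, the matrix of the adjoint is G^{-1} M^T G (a standard linear-algebra fact, not stated in the thread). Computing this for the basis (1, x, x^2) shows that M(T*) differs from M(T) even though M(T) equals its transpose; a sketch with sympy:

```python
import sympy as sp

# Gram matrix of (1, x, x^2) under <p, q> = integral_0^1 p*q dx:
# G[i, j] = <x^i, x^j> = 1/(i + j + 1), the 3x3 Hilbert matrix
G = sp.Matrix(3, 3, lambda i, j: sp.Rational(1, i + j + 1))

# Matrix of T with respect to (1, x, x^2)
M = sp.Matrix([[0, 0, 0],
               [0, 1, 0],
               [0, 0, 0]])

# For a real inner product, the adjoint's matrix in this basis is
# G^{-1} * M^T * G; this reduces to M^T only when G is the identity,
# i.e. when the basis is orthonormal.
M_adj = G.inv() * M.T * G

print(M_adj)       # not equal to M
print(M_adj == M)  # False, even though M == M.T
```

Since (1, x, x^2) is not orthonormal here, G is not the identity, so M = M^T says nothing about whether T* = T.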
 
I see now. Thanks for the clarification.
 
