
Linear Algebra - Self-adjoint Operators

  1. Apr 17, 2008 #1
    1. The problem statement, all variables and given/known data
    Make P2(R) into an inner-product space by defining <p, q> = [tex]\int_0^1 p(x)q(x)\,dx[/tex]. Define T in L(P2(R)) by T(a_0 + a_1*x + a_2*x^2) = a_1*x.

    (a) Show that T is not self-adjoint.
    (b) The matrix of T with respect to the basis (1, x, x^2) is

    [tex]\begin{pmatrix} 0 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{pmatrix}[/tex]

    This matrix equals its conjugate transpose, even though T is not self-adjoint. Explain why this is not a contradiction.

    2. Relevant equations

    3. The attempt at a solution
    To show that T is not self-adjoint, I need to show that T != T*. This amounts to explicitly finding what T* is. I know that <Tv, w> = <v, T*w>.

    So we have <T(a_0 + a_1*x + a_2*x^2), b_0 + b_1*x + b_2*x^2> = <a_1*x, b_0 + b_1*x + b_2*x^2> = a_1*x(b_0 + b_1*x + b_2*x^2). Now I can't quite figure out where to go from here. I know this has to equal <a_0 + a_1*x + a_2*x^2, T*(b_0 + b_1*x + b_2*x^2)>. How can I find T*?

    For part (b), I know that if an operator is self-adjoint, its matrix equals its conjugate transpose, but not necessarily the other way around. That's a crappy way of putting it, though. Maybe I'll have a better explanation once I find T*. Thanks for any help!
  3. Apr 17, 2008 #2
    :rolleyes: I wasn't paying attention to the problem and was calculating the inner-product wrong. It's obviously defined to be that integral between 0 and 1, not the dot product as I was doing before.
  4. Apr 17, 2008 #3
    Can I just do this:

    An operator is self-adjoint iff <Tv, w> = <v, Tw> for all v, w. So using the defined inner product, we have <T(a_0 + a_1*x + a_2*x^2), b_0 + b_1*x + b_2*x^2> = (1/2)*a_1*b_0 + (1/3)*a_1*b_1 + (1/4)*a_1*b_2.

    But we also have <a_0 + a_1*x + a_2*x^2, T(b_0 + b_1*x + b_2*x^2)> = (1/2)*a_0*b_1 + (1/3)*a_1*b_1 + (1/4)*a_2*b_1.

    Clearly <Tv, w> != <v, Tw>, so T is not self-adjoint. Is this correct?
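    As a quick symbolic sanity check of the two inner products above (this sympy snippet is illustrative and not part of the original working; the symbol names are my own):

    ```python
    import sympy as sp

    x, a0, a1, a2, b0, b1, b2 = sp.symbols('x a0 a1 a2 b0 b1 b2')

    def ip(p, q):
        # <p, q> = integral from 0 to 1 of p(x) q(x) dx
        return sp.integrate(p * q, (x, 0, 1))

    def T(poly):
        # T(a0 + a1 x + a2 x^2) = a1 x: keep only the linear term
        return sp.expand(poly).coeff(x, 1) * x

    p = a0 + a1*x + a2*x**2
    q = b0 + b1*x + b2*x**2

    lhs = sp.expand(ip(T(p), q))  # a1*b0/2 + a1*b1/3 + a1*b2/4
    rhs = sp.expand(ip(p, T(q)))  # a0*b1/2 + a1*b1/3 + a2*b1/4
    print(lhs)
    print(rhs)
    print(sp.simplify(lhs - rhs) == 0)  # the two sides differ
    ```

    The two printed expressions match the coefficients computed by hand, and they are not equal as polynomials in the a_i, b_i, confirming <Tv, w> != <v, Tw>.
    
    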
  5. Apr 17, 2008 #4
    That's correct. You could even pick an easier example: <1, T(x)> = <1, x> = 1/2 != 0, but <T(1), x> = <0, x> = 0. Not self-adjoint.
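    The counterexample above can be checked in two lines (my own verification, not part of the thread):

    ```python
    import sympy as sp

    x = sp.symbols('x')
    ip = lambda p, q: sp.integrate(p * q, (x, 0, 1))

    # T(x) = x and T(1) = 0, so the two sides of <Tv, w> = <v, Tw> differ:
    print(ip(1, x))  # <1, T(x)> = 1/2, nonzero
    print(ip(0, x))  # <T(1), x> = 0
    ```
    
    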
  6. Apr 17, 2008 #5
    Cool. Thanks for the confirmation. Now I'm trying to figure out why I can say that even though my matrix equals its conjugate transpose, it is not a contradiction. I'm thinking it has to do with how sparse the matrix is, specifically on the diagonal, but I'm not sure. Anyone? Thanks!
  7. Apr 17, 2008 #6
    Why do you think that?
  8. Apr 17, 2008 #7
    It has to do with which basis you use to express the matrix. The correspondence "T is self-adjoint iff its matrix equals its conjugate transpose" only holds when the matrix is taken with respect to an orthonormal basis, and (1, x, x^2) is not orthonormal under this inner product. This is the 'basis' of Hurkyl's skepticism.
  9. Apr 17, 2008 #8
    I see now. Thanks for the clarification. :smile: