Homework Help: Double dual/Double Transpose Question

  1. Nov 3, 2006 #1
    The question that I am stuck on is:
    Show that if X" (double dual of X) is identified with X and U" (double dual of U) with U via the duality relation, then T" (double transpose) = T.
    (Duality relation is f(L) = L (x) where f is in X", L is in X', and x is in X)
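The duality relation can be made concrete in coordinates. A minimal sketch in X = R^3 (the vectors and names here are my own illustration, not part of the problem), with functionals represented as vectors acting by the dot product:

```python
# Sketch of the duality relation f(L) = L(x): the canonical element
# f in X'' attached to x simply evaluates functionals at x.
import numpy as np

x = np.array([1.0, 2.0, 3.0])      # x in X = R^3
L = np.array([4.0, -1.0, 0.5])     # L in X', acting as L(y) = L . y

def double_dual_of(x):
    """Canonical embedding X -> X'': x |-> (L |-> L(x))."""
    return lambda L: L @ x

f = double_dual_of(x)              # f in X''
print(f(L), L @ x)                 # both print 3.5: f(L) = L(x)
```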

    So far, here is my work:

    We know:

    T: X --> U is a linear homogeneous map
    T': U' --> X' where U' is the dual of U and X' is the dual of X
    T": X" --> U" where X" is the double dual of X and U" is the double dual of U.

    Also, X" is isomorphic to X, and U" is isomorphic to U.

    I am missing something here, however. This is where I am stuck. How can one deduce that, in fact, T" = T? How do we show that two linear homogeneous maps are equivalent?
    The idea of a double dual has left me slightly confused and any help would REALLY be appreciated.

  3. Nov 3, 2006 #2


    Staff Emeritus, Science Advisor, Gold Member

    You can show two linear maps are equal in exactly the same way as any other function: by proving that T(x) = T"(x) for all x.
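In finite dimensions, this pointwise check reduces by linearity to a check on a basis. A small numerical sketch (the matrices are made-up examples, not from the problem):

```python
# Two linear maps are equal iff they agree on every vector; by
# linearity it is enough to check them on a basis.
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])   # one linear map on R^2
B = A.copy()                             # another map, equal to the first

basis = np.eye(2)                        # standard basis e1, e2
agree = all(np.allclose(A @ e, B @ e) for e in basis)
print(agree)                             # True: equal on a basis, hence equal
```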
  4. Nov 3, 2006 #3
    Let me check with you if this is correct...

    Let T(x) = u
    Then, we have to show that T"(f) = u where f is in X"?

    We know the following:
    For f in X", L in X', T"(f)(L) = f(T'(L))
    This implies: T"(f)(L) = (T'(L))(x) (by the duality relation)
    Then, T"(f)(L) = L(T(x))
    Then, T"(f)(L) = Lu
    But, this does not equal u. What am I doing wrong here?
    Last edited: Nov 3, 2006
  5. Nov 3, 2006 #4
    Strictly speaking, just as f is assumed to be the double dual of x, you need to show that T"(f) is the double dual of u.

    Your computation, however, is wrong:

    T"(f) is an element of U".
    L is an element of X'.
    Therefore, one cannot evaluate T"(f) at L.
  6. Nov 3, 2006 #5
    I'm very confused right now.
    By definition, isn't (T"(f))(L) = f(T'(L))?

    ...Can you give me any suggestions as to how to approach this problem?
  7. Nov 3, 2006 #6
    Short answer: you're using something from the wrong space. You need to consider an element of U', since T"(f) is an element of U".

    Honestly, I too find functions of functions confusing. (And worse, you're dealing with functions of functions of functions!)

    My solution is simply to try to be extra precise and write down everything that I know -- in particular, I try to (at least mentally) write down what set everything lives in, and if it's a function, I write down what its domain and its range are.

    One cute little trick that works in this particular example is to have elements of X" act on the right. In particular, you would write:

    Lf

    and not

    f(L)

    This has the benefit of the suggestively similar notation:

    Lf = Lx

    I'm not going to do that in what follows, though.

    (Changing letters to reduce possible confusion)

    If you have a map:

    S : Y --> Z

    then you have a map

    S' : Z' --> Y'

    Suppose g is an element of Z'. Then S'(g) is an element of Y'. In particular:

    S'(g) : Y --> F (where F is your base field)

    So, S'(g) has to take something in Y as its argument. (Not something in Z -- that's the mistake you were making)

    And, IIRC, the definition is:

    S'(g)(y) = g(S(y))

    (look at the type of everything, and make sure that the whole expression makes sense. e.g. note that S(y) is an element of Z)
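In coordinates, this definition is just the matrix transpose. A quick numerical sketch (the matrix and vectors are my own example), with functionals represented as vectors acting by the dot product:

```python
# If S has matrix M and a functional g in Z' is represented by a
# vector (g(z) = g . z), then S'(g) is represented by M^T g, since
# g(S(y)) = g . (M y) = (M^T g) . y.
import numpy as np

M = np.array([[1.0, 0.0, 2.0],
              [0.0, 3.0, 1.0]])        # S : R^3 -> R^2
g = np.array([2.0, -1.0])              # g in Z' = (R^2)'
y = np.array([1.0, 1.0, 1.0])          # y in Y = R^3

lhs = g @ (M @ y)                      # g(S(y))
rhs = (M.T @ g) @ y                    # S'(g)(y)
print(np.isclose(lhs, rhs))            # True
```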

    (P.S. you have to assume that X and U are finite dimensional spaces. I just want to make sure you know that)
    Last edited: Nov 3, 2006
  8. Nov 3, 2006 #7
    Ok, so let me start over again....

    I have:
    T: X --> U
    T': U' --> X'
    T": X" --> U"

    Then, take f in X"
    Therefore, T"(f) is in U".
    Then, T"(f)(L) = f(T'(L)) for L in U'
    Then T"(f)(L) = f(LT), because T'(L) = LT
    Thus, T"(f)(L) = (LT)(x) = L(T(x)) by the duality relation
    Finally, T"(f)(L) = Lu
    So, T"(f) does indeed map to Lu, which is in U".

    Now, is it enough to say that since X" is isomorphic to X (dim X" = dim X) and U" is isomorphic to U (dim U" = dim U), then T" = T? Or is there a step I am missing here? (I feel like I am missing something, but I'm not sure.)
  9. Nov 3, 2006 #8
    Is there anything else I need to do for this problem other than what I stated in the previous post?
    Last edited: Nov 4, 2006
  10. Nov 4, 2006 #9
    Does anyone know if what I have stated 2 posts ago is the correct answer?
  11. Nov 4, 2006 #10
    The computation itself is correct, but your conclusion isn't quite right: T"(f) isn't mapped to Lu. (T"(f) maps L to Lu.) The important thing is that, since L is arbitrary, T"(f) is the double dual of u.

    So, you've proven that if T(x) = u, then T"(x") = u". When they say:

    X" is identified with X​

    they mean that (wave hands a bit) we are considering x" = x to be an actual equality. (similarly, that u" = u) And that is what's required to show T" = T.

    If you feel uncomfortable with that level of imprecision, it should be okay to simply remember that:
    T"(x") = u" iff T(x) = u
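Under these identifications the statement has a familiar matrix shadow: the matrix of T' is the transpose of the matrix of T, so the matrix of T" is the double transpose, which is the original matrix again. A one-line numerical check (my example, not from the thread):

```python
# With X'' identified with X and U'' with U, the matrix of T'' is
# (M^T)^T = M, i.e. T'' = T.
import numpy as np

M = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])             # T : R^2 -> R^3
print(np.array_equal(M.T.T, M))        # True
```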
  12. Nov 4, 2006 #11
    I finally get it!
    Thanks for your help!!!!