
Linear Algebra (Vector spaces, linear independent subsets, transformations)

  1. Mar 26, 2009 #1
    Assignment question:
    Let [tex]V = P(R)[/tex] and for [tex]j \geq 1[/tex] define [tex]T_j(f(x)) = f^{(j)}(x)[/tex],
    where [tex]f^{(j)}(x)[/tex] is the jth derivative of f(x). Prove that the
    set {[tex]T_1, T_2, \dots, T_n[/tex]} is a linearly independent subset of [tex]L(V)[/tex]
    for any positive integer n.

    I don't see what [tex]V = P(R)[/tex] has to do with the rest of the problem, in particular with the transformation [tex]T_j(f(x)) = f^{(j)}(x)[/tex]. I guess I just don't understand how V ties in with the definition of the transformation T.

    2. Relevant ideas:
    I heard two different professors each saying a different method of proving this problem. One said it was possible to prove it by contradiction, and another tried to help me prove it directly.

    So when I started the method of contradiction I had:
    Assume [tex]a_1T_1 + a_2T_2 + \dots + a_nT_n = 0[/tex] with not all of the coefficients zero (goal: derive a contradiction from assuming the set is dependent?).
    Choose [tex]T_i[/tex] with [tex]a_i \neq 0[/tex] and [tex]1 \leq i \leq n[/tex].
    Therefore [tex]a_iT_i = -\sum_{j \neq i} a_jT_j[/tex].

    From here I couldn't continue. Do I have to show that the linear combination on the right of the equality takes elements to the same ith derivative, and derive a contradiction from that?
    ----------------

    The second method involved finding a basis of V. A professor said it was {[tex]1, x/1!, x^2/2!, \dots, x^n/n![/tex]}. But that doesn't make sense to me; I thought the basis would be {[tex]0, x, x^2, \dots, x^n[/tex]}.
     
    Last edited: Mar 26, 2009
  3. Mar 26, 2009 #2
    I'm going to assume that P(R) is the vector space of polynomials with real coefficients under regular addition and scalar multiplication. If so, what do you mean by L(V)?
     
  4. Mar 26, 2009 #3

    Dick

    Science Advisor
    Homework Helper

    If the T's are linearly dependent then there is a linear combination a1*T1+a2*T2+...+an*Tn, with the a's not all zero, that gives you zero when applied to any polynomial p(x), right? Ok, then apply that to specific polynomials and evaluate the result at x=0. Say p(x)=x. T1(x)=1, T2(x)=0, ... What does that tell you about a1? Suppose p(x)=x^2. T1(x^2)=2x, T2(x^2)=2, T3(x^2)=0. ... Now set x=0. What does that tell you about a2? Etc. Etc.
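This procedure can be sketched as a quick numerical check (a minimal Python sketch of my own, not from the thread; representing a polynomial as its list of coefficients is an assumption of the sketch):

```python
def derivative(coeffs):
    """Differentiate a polynomial stored as [c0, c1, c2, ...], where c_k is the x^k coefficient."""
    return [k * c for k, c in enumerate(coeffs)][1:] or [0]

def T(j, coeffs):
    """T_j(f) = the jth derivative of f."""
    for _ in range(j):
        coeffs = derivative(coeffs)
    return coeffs

def combo_at_zero(a, coeffs):
    """Evaluate (a1*T1 + ... + an*Tn)(f) at x = 0, i.e. take the constant coefficient."""
    return sum(a[j - 1] * T(j, coeffs)[0] for j in range(1, len(a) + 1))

# Apply the combination to p(x) = x^k and evaluate at 0.
# Only the T_k term survives the evaluation, contributing a_k * k!.
a = [5, -2, 7]                     # arbitrary nonzero test coefficients a1, a2, a3
for k in range(1, 4):
    p = [0] * k + [1]              # p(x) = x^k
    print(k, combo_at_zero(a, p))  # prints a_k * k!: 5, -4, 42
```

Since a_k * k! must equal 0 whenever the combination is the zero operator, each a_k is forced to be zero in turn.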
     
  5. Mar 26, 2009 #4

    Dick


    V=P(R). I think L(V) is linear transformations from V->V.
     
  6. Mar 26, 2009 #5
    My notes from class say "it is the set of linear operators on V".
     
  7. Mar 26, 2009 #6
    Dick, you are saying the exact same thing one of my professors was saying to me. However, I don't understand what you mean by "gives you zero when applied to any polynomial p(x)"...


    Thanks,


    JL
     
  8. Mar 26, 2009 #7

    Dick


    They are operators in L(V). If they are linearly dependent then there's a nontrivial linear combination of those operators that is IDENTICALLY zero. I.e., the combination of those operators applied to anything in the space is zero. That's what linearly dependent means. If you can show any nontrivial combination must be nonzero someplace, then you've shown they are linearly independent.
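To see what "identically zero" rules out, here is a small check (my own sketch, again representing polynomials as coefficient lists) that a sample nontrivial combination, T1 - T2, is not the zero operator, because it is nonzero on at least one polynomial:

```python
def derivative(coeffs):
    """Differentiate [c0, c1, c2, ...], where c_k is the x^k coefficient."""
    return [k * c for k, c in enumerate(coeffs)][1:] or [0]

def nth_derivative(coeffs, j):
    """Apply the derivative j times, i.e. T_j."""
    for _ in range(j):
        coeffs = derivative(coeffs)
    return coeffs

def poly_eval(coeffs, x):
    """Evaluate the polynomial at the point x."""
    return sum(c * x**k for k, c in enumerate(coeffs))

# (T1 - T2)(x^2) = 2x - 2, which is nonzero at x = 0.
p = [0, 0, 1]  # p(x) = x^2
value = poly_eval(nth_derivative(p, 1), 0) - poly_eval(nth_derivative(p, 2), 0)
print(value)   # -2, so T1 - T2 is nonzero somewhere and cannot be the zero operator
```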
     
    Last edited: Mar 26, 2009
  9. Mar 26, 2009 #8
    Oh, so p(x) is an element of P(R), where V = P(R)? But how do you know p(x) = x? How do you know the domain equals the image?

    Thanks for the help- it takes me awhile to learn these things,


    JL
     
  10. Mar 26, 2009 #9

    Dick


    S'ok. I don't know p(x)=x. p(x) could be anything. I'm just picking examples for p(x) that I can use to show the various a's must be zero.
     
  11. Mar 26, 2009 #10
    Oh, ok, I kind of see. But why are we setting x = 0 again? And I found [tex]a_1 = 1, a_2 = 2, a_3 = 6, a_4 = 24, \dots, a_n = n![/tex] So I'm guessing the set of all these values constitutes the independent set? Would this reasoning, along with what you provided, give me enough to write a proof?

    JL
     
  12. Mar 27, 2009 #11

    lanedance

    Homework Helper

    How did you get those a values?
    I could be very wrong, but from Dick's comments I thought you were trying to show that all the a's must be identically zero... this would give you a contradiction with your initial assumption that the set of operators is linearly dependent... and actually show they are linearly independent.
     
  13. Mar 27, 2009 #12

    Mark44

    Staff: Mentor

    Just a comment on something you said in the first post that I don't believe anyone has caught:
    You can never have 0 in a basis. 1, yes, but 0, no. Other than that, each vector (i.e., function) in your basis is a constant multiple of the corresponding member of the other basis.
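    To spell this out (my own addition, in the thread's notation): any set containing the zero vector is linearly dependent, because [tex]1 \cdot 0 = 0[/tex] is a nontrivial linear combination equal to zero. And the professor's set is a rescaling of the standard monomials, [tex]x^k/k! = (1/k!) \cdot x^k[/tex], a scaling that is presumably convenient here because [tex]T_k(x^k/k!) = 1[/tex].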
     
  14. Mar 27, 2009 #13
    Thanks everyone, I solved this problem.


    JL
     