
Linear Algebra: Linear Transformation and Linear Independence

  1. Sep 16, 2007 #1
    1. The problem statement, all variables and given/known data
    Let V and W be vector spaces, let T: V --> W be linear, and let {w1, w2, ..., wk} be a linearly independent subset of R(T). Prove that if S = {v1, v2, ..., vk} is chosen so that T(vi) = wi for i = 1, 2, ..., k, then S is linearly independent.


    2. Relevant equations



    3. The attempt at a solution
    I have no idea where to start with this proof. I've been looking over past theorems to gather information, but I'm not sure how to connect them to show that

    S is linearly independent. Can anyone give a hint or suggestion?

    Attempt 1:

    Since T is linear, we know that
    T(x + y) = T(x) + T(y)
    T(ax) = aT(x)

    Since

    {w1, w2, ..., wk} is a linearly independent subset of R(T),

    a1w1 + a2w2 + ... + akwk = 0
    implies a1 = a2 = ... = ak = 0


    T(v1 +...+ vk) = T(v1) +...+ T(vk) = w1 +...+ wk (and since linearly independent)
    0w1 +...+ 0wk = 0,

    So then I worked backwards:

    0 = 0w1 +...+ 0wk = 0T(v1) +...+ 0T(vk) = 0T(v1 +...+ vk) = T(0v1 +...+ 0vk)

    and hence 0v1 + ...+ 0vk = 0

    therefore S = {v1, v2, ..., vk} is linearly independent?



    Attempt 2:

    since R(T) is a subspace of W
    and {w1, w2,..., wk} is a subset of R(T)
    then span{w1, w2,..., wk} is a subset of R(T)

    I was hoping to show that S is a basis, and hence that S is linearly independent,
    but I couldn't get there.


    T(vi) = wi

    does that mean each vector wi can be written as a unique linear combination of the vi,

    and hence {v1, ..., vk} is a basis, and thus S is linearly independent?

    I don't think my methods are correct.

    Any suggestions would be helpful. Thanks a lot!






     
    Last edited: Sep 16, 2007
  3. Sep 16, 2007 #2

    HallsofIvy

    User Avatar
    Staff Emeritus
    Science Advisor

    No, you have shown that 0v1+ 0v2+ ...+ 0vk= 0 but we knew that anyway!
    Try an "indirect proof". If v1, v2, ..., vk are NOT independent then there exist
    a1, a2,..., ak, NOT all 0, such that a1v1+ a2v2+ ...+ akvk= 0. What happens if you take T of both sides of that?


    No, it doesn't. wi may not even be in vector space V.

    There is nothing here to suggest that either {v1, v2, ..., vk} or {w1, w2, ..., wk} is a basis, only that they are independent. Try the way I suggested.
     
  4. Sep 16, 2007 #3
    Alright, I think I got it.

    When you said to use an

    "Indirect Proof", I used proof by contradiction and obtained this:


    Problem:
    Let V and W be vector spaces, let T: V --> W be linear, and let {w1, w2, ..., wk} be a linearly independent subset of R(T). Prove that if S = {v1, v2, ..., vk} is chosen so that T(vi) = wi for i = 1, 2, ..., k, then S is linearly independent.

    S = {v1,v2,...vk} is chosen so that T(vi) = wi, for i = 1, 2,...,k --> S is linearly independent

    I assumed the negation

    S = {v1,v2,...vk} is chosen so that T(vi) = wi, for i = 1, 2,...,k and S is Linearly Dependent

    Since S is linearly dependent, there exist scalars a1, a2, ..., ak, not all zero, such that

    a1v1 + a2v2 + ... + akvk = 0

    Then, as you suggested, taking T of both sides,

    T(a1v1 + a2v2 + ... + akvk) = T(0)

    and since T is linear

    a1 T(v1) + ... + ak T(vk) = 0 [since T(0)=0 ]

    then our other assumption, T(vi)=wi

    implies that

    a1w1 +...+ akwk = 0

    Since there exists a nonzero coefficient among the ai, this implies that

    {w1, w2, ..., wk} is linearly DEPENDENT,

    which contradicts the statement that {w1, w2, ... , wk} is Linearly Independent

    thus the original statement is true.

    Is this correct?
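    As a sanity check (not part of the proof), I also tried the contradiction argument numerically with a made-up matrix T: if S is dependent, its images under T must be dependent too. The matrix A and the vectors below are just an invented example:

```python
import numpy as np

# Hypothetical example: T is the linear map x -> A @ x on R^3.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

# A dependent set S: v3 = v1 + v2, so S is linearly dependent.
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = v1 + v2

S = np.column_stack([v1, v2, v3])
W = A @ S  # columns are w_i = T(v_i)

# Dependence of S forces dependence of the images, exactly as the
# contradiction argument predicts: rank < number of vectors.
print(np.linalg.matrix_rank(S))  # 2, not 3
print(np.linalg.matrix_rank(W))  # 2, not 3
```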
     
  5. Sep 17, 2007 #4
    I'm a beginner to proofs myself, but your proof looks correct to me. I don't think you needed to use contradiction.

    Here's my work, although I'm not 100% sure it's correct either:

    suppose f(c1v1 + c2v2 + ... + ckvk) = c1w1 + c2w2 + ... + ckwk = 0
    Then, since (w1, ..., wk) is linearly independent, c1 = c2 = ... = ck = 0

    As T is linear, f(0) = 0 since W is a vector space?

    Thus, c1v1 + c2v2 + ... + ckvk = 0. From above, c1 = c2 = ... = ck = 0. Thus v1, v2, ..., vk is linearly independent
     
  7. Sep 17, 2007 #6

    HallsofIvy

    User Avatar
    Staff Emeritus
    Science Advisor

    First, if you are given a linear transformation, T, don't start talking about "f"!

    Second, while it is certainly true that T(0) = 0, it is NOT always true that if T(v) = 0, then we must have v = 0! The kernel of a linear transformation is not necessarily {0}. And that's the direction you need.
     
  8. Sep 17, 2007 #7
    OK, let me try it again:

    Suppose c1v1 + c2v2 + ... + ckvk = 0. Taking T of both sides, we obtain
    T(c1v1 + c2v2 + ... + ckvk) = T(0) = 0 [since T(0) = 0 because T is linear?]. Then
    c1w1 + c2w2 + ... + ckwk = 0.
    Then, since (w1, ..., wk) is linearly independent, c1 = c2 = ... = ck = 0.

    Thus c1v1 + c2v2 + ... + ckvk = 0 only when c1 = c2 = ... = ck = 0, so S = {v1, v2, ..., vk} must be a linearly independent set.
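    To convince myself, I also tried a made-up numerical example where T has a nontrivial kernel (so T(v) = 0 does not force v = 0), and preimages of independent images still come out independent. The matrix A and the vectors here are just an invented illustration:

```python
import numpy as np

# Hypothetical example: T(x) = A @ x maps R^3 -> R^2, so T has a
# nontrivial kernel, yet the conclusion of the proof still holds.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])

# Choose v1, v2 whose images w1, w2 are linearly independent in R(T).
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
w1, w2 = A @ v1, A @ v2

V = np.column_stack([v1, v2])
W = np.column_stack([w1, w2])

# {w1, w2} independent  ==>  {v1, v2} independent, as the proof shows:
# full column rank means the only vanishing combination is the trivial one.
print(np.linalg.matrix_rank(W))  # 2
print(np.linalg.matrix_rank(V))  # 2
```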
     
  9. Sep 17, 2007 #8

    matt grime

    User Avatar
    Science Advisor
    Homework Helper


    If you don't understand why T(0) is 0, then you should try to prove it.
     
  10. Sep 17, 2007 #9

    radou

    User Avatar
    Homework Helper

    Looks correct.
     