Problem on linear independence and matrices

  1. Jan 7, 2008 #1
    Can I ask for some help?

    Suppose that {v1,v2...vn} is a linearly independent set of vectors and A is a singular matrix.
    Prove or disprove: The set {Av1, Av2, ...Avn} is linearly independent.
  3. Jan 7, 2008 #2
    How can you relate two different bases?
  4. Jan 7, 2008 #3
    The vectors are the same. And they're not bases, they're just sets of vectors.
  5. Jan 7, 2008 #4
    But the vectors of the first set are linearly independent, so they form a basis. If the second set has linearly independent members, it should also be a basis. Correct?
  6. Jan 7, 2008 #5
    But doesn't the singular matrix change the property of linear independence?
  7. Jan 7, 2008 #6
    So the members of the second set cannot be independent! :smile:
  8. Jan 7, 2008 #7
    Hmm... I don't quite get it. Can you write a rough proof? :) Thanks a lot!
  9. Jan 7, 2008 #8
    The first set forms a basis since the vectors are linearly independent.

    Suppose now that the members of the 2nd set are linearly independent; then they also form a basis. Two bases are related by a non-singular matrix, but in our case they are related by a singular one. Thus the members of the 2nd set are not linearly independent.

    What about that?
  10. Jan 7, 2008 #9
    How does the singular matrix change the linear independence of the basis?
  11. Jan 7, 2008 #10
    Let me show it with equations.

    Call the vectors of the 2nd set [itex]\bar{v}^\alpha[/itex], then

    [tex] \bar{v}^\alpha=A^\alpha_\beta\,v^\beta[/tex]

    In order for [itex]\bar{v}^\alpha[/itex] to be linearly independent, it must hold that

    [tex]\lambda_\alpha \, \bar{v}^\alpha=0\Rightarrow \lambda_{\alpha}=0 \quad \forall \alpha[/tex]

    Now we have

    [tex]\lambda_\alpha \, \bar{v}^\alpha=0\Rightarrow \lambda_\alpha \, A^\alpha_\beta\,v^\beta=0 \Rightarrow \lambda_\alpha \, A^\alpha_\beta=0 [/tex]

    The last equality holds because the [itex]v^\beta[/itex] are linearly independent.

    This is an [itex]n\times n[/itex] homogeneous system for the unknowns [itex]\lambda_\alpha[/itex]. In order for it to have only the trivial solution, it must hold that [itex]\det(A)\neq 0[/itex]. But [itex]\det(A)= 0[/itex], so there is a solution for [itex]\lambda_\alpha[/itex] besides the trivial one. Thus the vectors [itex]\bar{v}^\alpha[/itex] are dependent.
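
    A quick numerical check of this argument, as a minimal sketch in Python/numpy (the particular singular matrix below is just one convenient choice, not anything from the thread):

    [code]
    import numpy as np

    n = 3
    V = np.eye(n)                        # columns v_1, ..., v_n: the standard basis, linearly independent
    A = np.array([[1.0, 2.0, 3.0],
                  [4.0, 5.0, 6.0],
                  [7.0, 8.0, 9.0]])      # det(A) = 0, so A is singular

    AV = A @ V                           # column alpha is A v_alpha

    print(np.linalg.matrix_rank(AV))     # 2 < n: the images are linearly dependent

    # The nontrivial lambda the argument predicts lies in the null space of AV:
    _, _, Vt = np.linalg.svd(AV)
    lam = Vt[-1]                         # right singular vector for the zero singular value
    print(np.allclose(AV @ lam, 0.0))    # True: sum_alpha lambda_alpha (A v_alpha) = 0
    [/code]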
    Last edited: Jan 7, 2008
  12. Jan 7, 2008 #11
    If there is only the trivial solution, what does that imply? :) Thanks btw.
  13. Jan 7, 2008 #12



    How about just trying for a counterexample? What is the simplest singular linear transformation you know?
  14. Jan 7, 2008 #13
    The trivial solution is [itex]\lambda_\alpha=0[/itex] for all [itex]\alpha[/itex]; if it were the only solution, the [itex]\bar{v}^\alpha[/itex] would be independent.
  15. Jan 17, 2008 #14

    I hope I am not spoiling the fun, but I think things are getting confused.

    Anything could happen here. Since we are not requiring that {v_1, ..., v_n} is a basis, here are two examples. Take any set of linearly independent vectors and let A = 0; then of course their images are not linearly independent. On the other hand, take A to be some nonzero (but singular) matrix, and let v_1 be any vector not in its kernel. Then {v_1} is linearly independent, and so is {Av_1}.
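
    Both cases are easy to verify numerically; a minimal sketch in Python/numpy (the matrices and vectors here are my own arbitrary choices):

    [code]
    import numpy as np

    # Case 1: A = 0. Any independent set maps to zero vectors, which are dependent.
    v1, v2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])   # {v1, v2} is independent
    A_zero = np.zeros((2, 2))
    images = np.column_stack([A_zero @ v1, A_zero @ v2])
    print(np.linalg.matrix_rank(images))                  # 0: {A v1, A v2} is dependent

    # Case 2: A nonzero but singular, and v1 not in its kernel.
    A = np.array([[1.0, 0.0],
                  [0.0, 0.0]])                            # singular projection onto the x-axis
    v1 = np.array([1.0, 1.0])                             # not in ker(A)
    print(A @ v1)                                         # [1. 0.] != 0, so {A v1} is independent
    [/code]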
  16. Jan 17, 2008 #15
    The OP was: "Suppose that {v1, v2, ..., vn} is a linearly independent set of vectors and A is a singular matrix."

    Since the [itex]v_i[/itex] are linearly independent, they form a basis, assuming of course that the dimension of the vector space is n. :smile:
  17. Jan 17, 2008 #16



    All you really need is a 'counterexample'. If {v1, v2, ..., vn} is a set of independent vectors, and A is the linear transformation that takes every v into 0, what can you say about {Av1, Av2, ..., Avn}?
  18. Jan 17, 2008 #17



    What's your definition of singular? If it means the columns are not independent, you are done, at least assuming you know the basic theory of dimension.
  19. Jan 18, 2008 #18



    Yes, and masnevets' point was that there is no reason to assume that! In any case, it extends what I said: suppose {v1, v2, ...} is a set of independent vectors and A is the zero operator (Av = 0 for all v).