
Linear dependence of two vectors

  1. Jun 23, 2015 #1

    blue_leaf77

    Science Advisor
    Homework Helper

    Suppose the vectors ##v_a## and ##v_b## are linearly independent, and another vector ##v_c## is linearly dependent on ##v_a## and ##v_b##. Now if I form a new vector ##v_d = v_b + cv_c##, with ##c## a constant, will ##v_d## be linearly independent of ##v_a##?
    I need to check how I can satisfy the equation
    $$C_1 v_a + C_2 v_d = 0$$
    If that's only possible when ##C_1 = C_2 = 0##, then ##v_a## and ##v_d## are linearly independent; but I don't think that's necessarily the case.
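    One quick way to experiment with this is numerically: stack the two vectors as rows of a matrix and check its rank (rank 2 means independent). The concrete vectors and values of ##c## below are my own illustrative choices, not from the problem:

```python
import numpy as np

# Illustrative vectors: v_c is deliberately chosen dependent on v_a and v_b.
v_a = np.array([1.0, 0.0])
v_b = np.array([0.0, 1.0])
v_c = v_a + v_b                # a linear combination of v_a and v_b

for c in (0.0, 1.0, -1.0):
    v_d = v_b + c * v_c
    rank = np.linalg.matrix_rank(np.vstack([v_a, v_d]))
    print(c, "independent" if rank == 2 else "dependent")
```

    For ##c = -1## this gives ##v_d = (-1, 0)##, a multiple of ##v_a##, so independence indeed does not hold for every choice.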
     
  3. Jun 23, 2015 #2

    PeroK

    Science Advisor
    Homework Helper
    Gold Member

    Do you mean ##v_c## is linearly independent of both ##v_a## and ##v_b##?

    If so, it's clear that ##v_a## and ##v_d## are linearly independent.
     
  4. Jun 23, 2015 #3

    blue_leaf77

    Science Advisor
    Homework Helper

    In general I want ##v_c## to be an arbitrary vector (in the same vector space as ##v_a## and ##v_b##).
    Actually, this question arose when I was reading about the rank of a matrix. The book says that if a row of a matrix ##A##, say ##v_i##, is replaced by ##v_i + cv_j##, where ##c## is a constant and ##v_j## is another row of the same matrix, then the rank of the resulting matrix is the same as that of ##A##. That is the same as saying that the number of linearly independent rows is unchanged by the replacement. I'm trying to prove that this is true.
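    As a sanity check (not a proof), the claim can be tested numerically on a randomly generated matrix; the matrix size, seed, row indices, and constant below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(42)
A = rng.standard_normal((4, 3))      # an arbitrary matrix
c = 2.5                              # an arbitrary constant

B = A.copy()
B[1] = A[1] + c * A[2]               # replace row v_1 with v_1 + c*v_2

print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(B))  # ranks agree
```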
     
    Last edited: Jun 23, 2015
  5. Jun 23, 2015 #4

    PeroK

    Science Advisor
    Homework Helper
    Gold Member

    In that case, you can consider the span of ##v_i## and ##v_j## and the span of ##v_i + cv_j## and ##v_j##.

    These are equal, so the span of the rows is unchanged.
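    Spelling this out: every linear combination of ##v_i + cv_j## and ##v_j## is a linear combination of ##v_i## and ##v_j##, and conversely,
    $$a(v_i + cv_j) + bv_j = av_i + (ac + b)v_j, \qquad a'v_i + b'v_j = a'(v_i + cv_j) + (b' - a'c)v_j,$$
    so the two spans contain exactly the same vectors.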
     
  6. Jun 23, 2015 #5

    blue_leaf77

    Science Advisor
    Homework Helper

    But ##v_i## and ##v_j## can be an arbitrary pair of rows; they need not belong to the linearly independent set of rows which determines the rank of ##A##. In the case when they are not linearly independent, does it still make sense to say that they span some space?
     
  7. Jun 23, 2015 #6

    PeroK

    Science Advisor
    Homework Helper
    Gold Member

    Yes, absolutely. There are lots of ways to prove this for arbitrary vectors. One way is simply to equate an arbitrary linear combination in each case. Another is to think of the two vectors in terms of the basis vectors needed to represent them, for whatever basis you choose.
     
  8. Jun 23, 2015 #7

    WWGD

    Science Advisor
    Gold Member

    Yes, you can prove this by induction. If ##v_1, v_2## are independent, then show that the set ##\{av_1 + bv_2\}## is a subspace, for ##a, b## in the base field ##K##. Now you can generalize to ##n## vectors.
     
  9. Jun 24, 2015 #8

    HallsofIvy

    Staff Emeritus
    Science Advisor

    Any set of vectors "span some space". If they happen to be linearly independent, then they form a basis for that space. If they are dependent, then some vectors in the set can be written as a linear combination of the others. Dropping those vectors from the set will give a basis for the space.
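    This "drop the dependent vectors" step can be sketched in code: greedily keep each vector that enlarges the span of those kept so far. The function name and rank tolerance below are my own choices:

```python
import numpy as np

def independent_subset(vectors, tol=1e-10):
    """Keep each vector that increases the rank of the kept set;
    the kept vectors form a basis of the span of the whole set."""
    kept = []
    for v in vectors:
        trial = np.array(kept + [v])
        if np.linalg.matrix_rank(trial, tol=tol) == len(trial):
            kept.append(v)
    return np.array(kept)

rows = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # third row is dependent
basis = independent_subset(rows)
print(basis.shape[0])  # 2 vectors remain
```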
     
  10. Jun 24, 2015 #9

    blue_leaf77

    Science Advisor
    Homework Helper

    Ok, I think this is caused by my lack of understanding of the definition of the span of a set of vectors. I just found, both in Wikipedia and in my textbook, that
    "The set of all linear combinations of ##(v_1, \dots , v_m)## is called the span of ##(v_1, \dots , v_m)##."
    Neither mentions that the vectors in the list must be linearly independent. Now I agree with what you said.

    But then what does the span of the rows staying the same imply about the rank of ##A##? Does it automatically imply that the number of linearly independent rows in ##A## also stays the same?
     
  11. Jun 24, 2015 #10

    PeroK

    Science Advisor
    Homework Helper
    Gold Member

    The row rank of a matrix is the dimension of the span of its rows = the number of linearly independent rows = the number of basis vectors needed to represent the span of all the rows.

    You can prove the result you want using any of these equivalent properties. But, perhaps using the last is the easiest, since:

    The span of ##v_i + cv_j## and ##v_j## requires precisely the same set of basis vectors as the span of ##v_i## and ##v_j##, and this doesn't affect the basis vectors required to represent the span of the other row vectors. Hence, the row rank is unchanged by this row operation.

    PS: In the end, however you prove it, it all hinges on the fact that all linear combinations of ##v_i + cv_j## and ##v_j## are linear combinations of ##v_i## and ##v_j##, and vice versa.
     
    Last edited: Jun 24, 2015
  12. Jun 24, 2015 #11

    blue_leaf77

    Science Advisor
    Homework Helper

    Thanks that really helps.
     
  13. Jun 24, 2015 #12

    blue_leaf77

    Science Advisor
    Homework Helper

    Does this mean that I can represent any row as a linear combination of the linearly independent rows?
     
  14. Jun 24, 2015 #13

    FactChecker

    Science Advisor
    Gold Member

    Consider the simplest case: ##v_a = (1,0)##, ##v_b = (0,1)##.
    Define ##v_c = v_a - v_b## and ##v_d = v_b + 1\cdot v_c = v_a##.
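    This counterexample is easy to verify numerically: here ##v_d## collapses onto ##v_a##, so the pair is dependent.

```python
import numpy as np

v_a = np.array([1.0, 0.0])
v_b = np.array([0.0, 1.0])
v_c = v_a - v_b
v_d = v_b + 1 * v_c            # = v_a

assert np.array_equal(v_d, v_a)
print(np.linalg.matrix_rank(np.vstack([v_a, v_d])))  # 1: dependent
```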
     



