
Determining linear independence

  Mar 15, 2012 #1
    1. The problem statement, all variables and given/known data

    If set A={u,v,w} ⊂ R^n is linearly independent, is B={u-v, u+w, v+w}⊂ R^n linearly independent?


    2. Relevant equations


    3. The attempt at a solution

    Since A is linearly independent, the only scalars a1, a2, a3 satisfying a1*u + a2*v + a3*w = 0 are a1 = a2 = a3 = 0. Then, to determine whether B is linearly independent, I think I need to determine whether b1 = b2 = b3 = 0 is the only solution of b1*(u-v) + b2*(u+w) + b3*(v+w) = 0. But if I do so, I find no connection between a1, a2, a3 and b1, b2, b3. How can I complete this proof?
     
    Last edited: Mar 15, 2012
  Mar 15, 2012 #2

    Mark44

    Staff: Mentor

    Yes, solve the equation above for b1, b2, and b3. You will need to use the fact that a1*u + a2*v + a3*w = 0 has only the trivial solution a1 = a2 = a3 = 0. That's the connection you're asking about.
     
  Mar 15, 2012 #3
    Ok, I tried to solve the equation and it goes like this:

    if u = (u1, u2, ..., un), v = (v1, v2, ..., vn), w = (w1, w2, ..., wn),
    then b1(u-v)+b2(u+w)+b3(v+w)=0 becomes

    b1(u1-v1)+b2(u1+w1)+b3(v1+w1)=0
    b1(u2-v2)+b2(u2+w2)+b3(v2+w2)=0
    .
    .
    .
    b1(un-vn)+b2(un+wn)+b3(vn+wn)=0

    Then how should I find b1, b2 and b3? Also, I don't quite get what you mean by 'you will need to use the fact that a1*u + a2*v + a3*w = 0 has only the trivial solution'.
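
    In case it helps, here is one way to actually solve for b1, b2, b3 numerically. It is only a sketch: the vectors u, v, w below are an arbitrary linearly independent choice, and NumPy is assumed.

    Code:
    import numpy as np

    # An arbitrary linearly independent choice of u, v, w in R^4
    u = np.array([1.0, 0.0, 0.0, 2.0])
    v = np.array([0.0, 1.0, 0.0, 3.0])
    w = np.array([0.0, 0.0, 1.0, 5.0])

    # Columns of M are u-v, u+w, v+w, so M b = 0 is exactly the system above
    M = np.column_stack([u - v, u + w, v + w])

    # The null space of M contains every possible (b1, b2, b3); read it off the SVD
    _, s, Vt = np.linalg.svd(M)
    print("smallest singular value:", s[-1])   # ~0, so a nonzero b exists
    print("null-space direction:", Vt[-1])     # proportional to (1, -1, 1)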
     
  Mar 15, 2012 #4

    Mark44

    Staff: Mentor

    You don't need all of this. Start by assuming that b1(u-v) + b2(u+w) + b3(v+w) = 0. Now write this equation as a linear combination of u, v, and w (IOW, as a sum of constant multiples of u, v, and w).

    From that equation you should be able to decide whether u - v, u + w, and v + w are also linearly independent.
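
    Written out, that regrouping looks like this:

    b1(u - v) + b2(u + w) + b3(v + w) = (b1 + b2)u + (-b1 + b3)v + (b2 + b3)w = 0

    Since u, v, w are linearly independent, every coefficient must vanish, so b1 + b2 = 0, -b1 + b3 = 0 and b2 + b3 = 0. Whether that small system forces b1 = b2 = b3 = 0 is exactly the question to settle.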

     
  Mar 16, 2012 #5
    Maybe I did something wrong (please correct me if so), but I think B is linearly dependent. A linear dependence relation would be b1 = 1, b2 = -1, b3 = 1, or any multiple of it. I uploaded a PDF file showing how I found this. The way this problem was worded led me to think B should be linearly independent, or I would have spotted it more quickly.
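
    As a quick check of that relation: 1*(u - v) - 1*(u + w) + 1*(v + w) = u - v - u - w + v + w = 0 for any vectors u, v, w whatsoever, so B is linearly dependent no matter which independent set A you start from.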
     

  Mar 16, 2012 #6

    Mark44

    Staff: Mentor

    Yes, that shows that the vectors in B are a linearly dependent set.
     
  Mar 16, 2012 #7

    epenguin

    Homework Helper
    Gold Member

    I don't think it's showing off, but I took one look at that and thought: the second element minus the first equals the third, so there is linear dependence without any calculation at all. IOW, I think your conclusion is right.
     
  Mar 16, 2012 #8
    Exactly. Most of these linear independence problems can be settled very quickly just by looking at the vectors for a few seconds and checking for obvious combinations. If none jump out, then start your calculations.

    I personally think the easiest way to solve these problems is to calculate the determinant of a matrix whose columns are the vectors. Let me explain:

    Since A = {u, v, w} is linearly independent, it is a basis of the three-dimensional subspace it spans, so we can write everything in coordinates relative to C = {u, v, w} = {e1, e2, e3}.

    The given set B = {u-v, u+w, v+w} can then be written B = {e1 - e2, e1 + e3, e2 + e3}, or, in coordinates relative to C: B = {(1,-1,0), (1,0,1), (0,1,1)}.

    Now take those three coordinate vectors and form a matrix with them, as columns or as rows, it doesn't matter (if the determinant of a matrix is ≠ 0, then the determinant of its transpose is also ≠ 0). If M is your matrix, calculate det(M) and you will get det(M) = 0, which means the vectors forming the matrix are linearly dependent. If instead det(M) ≠ 0, they would be linearly independent.

    Related to this is the concept of "rank". The rank of a matrix is the number of linearly independent vectors among its rows (equivalently, its columns). If your matrix is n x n and its determinant is nonzero, then its rank is n. In your problem rank(M) = 2, meaning there are only 2 linearly independent vectors in the set B.
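
    Here is a sketch of that calculation (NumPy assumed; the rows are the coordinate vectors from above):

    Code:
    import numpy as np

    # Rows are u-v, u+w, v+w in coordinates relative to the basis {u, v, w}
    M = np.array([[1.0, -1.0, 0.0],
                  [1.0,  0.0, 1.0],
                  [0.0,  1.0, 1.0]])

    print(np.linalg.det(M))          # ~0 -> the rows are linearly dependent
    print(np.linalg.matrix_rank(M))  # 2  -> only two independent vectors in B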

    Knowing the rank of a set of vectors is very useful because it tells you whether you have "repeated information" or not. For example, suppose you have a system of 10 linear equations and you associate each equation with the vector of its coefficients. If the rank of the matrix they form turns out to be only 4, you can throw away 6 of the equations and end up with an equivalent system (one whose solutions are exactly the same). Of course, you have to select the discarded equations carefully, making sure each one is a linear combination of the equations you keep.
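
    A toy illustration of that last point (made-up random numbers, NumPy assumed): build 4 independent equations, add 6 more that are combinations of them, and the rank of the full coefficient matrix stays 4.

    Code:
    import numpy as np

    rng = np.random.default_rng(0)
    A4 = rng.standard_normal((4, 7))      # 4 independent equations in 7 unknowns
    C = rng.standard_normal((6, 4))
    A10 = np.vstack([A4, C @ A4])         # 6 extra equations built from the first 4

    print(np.linalg.matrix_rank(A10))     # 4 -> six of the ten equations are redundant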

    Greetings!
     
  Mar 17, 2012 #9
    I see!
     
    Last edited: Mar 18, 2012