
Row reducing a matrix to determine linear dependence?

  1. Mar 14, 2012 #1
    1. The problem statement, all variables and given/known data

    Determine if vectors

    <1,-1,0,2,3>
    <1,0,-1,3,3>
    <1,-1,0,3,0>
    <0,1,-1,2,-2>

    are linearly dependent or independent


    2. Relevant equations

    I have been solving these questions in the book by putting the vectors into a matrix and row reducing. If I wound up with a free variable, I concluded the vectors were linearly dependent; if there was no free variable after row reducing, I concluded they were linearly independent. This method worked on the examples in the book (unless that was just by chance?).


    3. The attempt at a solution

    This is the problem that appeared on the first exam. I solved as described, and found a free variable; the answer was marked wrong. I thought the existence of a parameter in a solution set meant that at least two of the vectors were in the same span, therefore linearly dependent. Am I mistaken?
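
    For reference, the same check can be run by machine; here is a minimal sketch using sympy (the tool choice is incidental, row reduction by hand gives the same reduced form):

        from sympy import Matrix

        # The four vectors from the problem statement, written as the columns of A (5 x 4).
        A = Matrix([
            [ 1,  1,  1,  0],
            [-1,  0, -1,  1],
            [ 0, -1,  0, -1],
            [ 2,  3,  3,  2],
            [ 3,  3,  0, -2],
        ])

        R, pivot_cols = A.rref()
        print(R)           # reduced row echelon form
        print(pivot_cols)  # columns that contain a pivot

        # A free variable exists only if some column has no pivot.
        # If len(pivot_cols) == 4, every constant in
        # c1*v1 + c2*v2 + c3*v3 + c4*v4 = 0 is forced to be 0 (independent);
        # fewer than 4 pivots means a free variable and hence dependence.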
     
  3. Mar 14, 2012 #2

    Mark44

    Staff: Mentor

    Did you write your vectors as columns in the matrix you row reduced? If so, you will get at least one row of zeroes in the reduced matrix. If there is only one row of zeroes, the vectors are linearly independent. If there are two or more rows of zeroes, the vectors are linearly dependent.

    If we let v1, v2, v3, and v4 be the columns of a matrix A, then row reducing A amounts to solving the equation c1v1 + c2v2 + c3v3 + c4v4 = 0 for the constants. If the row-reduced matrix has only a single row of zeroes, the only solution is c1 = c2 = c3 = c4 = 0, so the vectors are linearly independent. If the row-reduced matrix has two or more rows of zeroes, there is a solution in which at least one constant is nonzero, so the vectors are linearly dependent.
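
    A minimal sketch of that zero-row count for the 5 x 4 matrix (sympy again, purely for illustration):

        from sympy import Matrix

        cols = [[1, -1, 0, 2, 3], [1, 0, -1, 3, 3], [1, -1, 0, 3, 0], [0, 1, -1, 2, -2]]
        A = Matrix(cols).T          # vectors as columns -> 5 rows, 4 columns
        R, pivots = A.rref()

        zero_rows = sum(1 for i in range(R.rows) if all(x == 0 for x in R.row(i)))
        print(zero_rows, len(pivots))

        # For a 5 x 4 matrix, zero_rows == 5 - rank(A), so:
        #   exactly one zero row  -> rank 4 -> only the trivial solution -> independent
        #   two or more zero rows -> rank < 4 -> nontrivial solutions    -> dependent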
     
  4. Mar 14, 2012 #3
    Thanks Mark44 for the reply. I did write the vectors as columns. But nowhere in the book does it say you need two rows of zeroes for dependence.

    Example: (assume these are columns)

    <1,2,3>
    <4,5,6>
    <2,1,0>

    Determine if the set is linearly dependent.

    Solution:

    "Clearly, x1 and x2 are basic variables, and x3 is free. Each nonzero value of x3 determines a nontrivial solution of (1). Hence, vector set is linearly dependent (and not linearly independent)."

    Am I reading this wrong?
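
    For comparison, the same machine check applied to the book's 3 x 3 example (sympy used purely for illustration):

        from sympy import Matrix

        # The book's vectors as the columns of a 3 x 3 matrix.
        A = Matrix([
            [1, 4, 2],
            [2, 5, 1],
            [3, 6, 0],
        ])
        R, pivots = A.rref()
        print(R)       # the last row reduces to all zeroes
        print(pivots)  # (0, 1): x1 and x2 are basic; the third column has no pivot, so x3 is free

        # rank 2 < 3 vectors, so this set is linearly dependent, as the book says
        # (in fact v3 = -2*v1 + v2).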
     
  5. Mar 14, 2012 #4
    If you set your vectors up as rows, as you wrote them in the opening post, then Gaussian elimination shows dependence or independence directly: the vectors are dependent exactly when some row reduces to all zeroes, and independent otherwise.
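
    A sketch of that row version (tooling illustrative only):

        from sympy import Matrix

        # The vectors kept as ROWS, exactly as listed in post #1.
        B = Matrix([
            [1, -1,  0, 2,  3],
            [1,  0, -1, 3,  3],
            [1, -1,  0, 3,  0],
            [0,  1, -1, 2, -2],
        ])
        R, _ = B.rref()
        print(R)

        # Any all-zero row in R would signal dependence; no zero row -> independent.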
     
  6. Mar 14, 2012 #5
    Thanks Joffan, haven't heard that before (undergrad)...

    Mark44, appreciate you specifically saying two rows are needed... you are of course correct... and I think I need a different book
     
  7. Mar 14, 2012 #6

    Mark44

    Staff: Mentor

    lonewolf219,
    In your example in post #3, you have a 3 x 3 matrix. If you row reduce and find a row of zeroes, the vectors are linearly dependent, for reasons I gave earlier.

    In your original example in post #1, the matrix whose columns are those vectors is a 5 x 4 matrix (i.e., 5 rows and 4 columns). Such a matrix must row reduce to a form with at least 1 row of zeroes. If there is just 1 zero row, the vectors are linearly independent. If there are 2 or more zero rows, the vectors are linearly dependent.
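
    Both cases come down to the same comparison: the vectors are independent exactly when the rank equals the number of vectors. A small sketch (the helper name is just illustrative):

        from sympy import Matrix

        def linearly_independent(vectors):
            """vectors: a list of equal-length lists. Independent iff rank == count."""
            M = Matrix(vectors)              # one vector per row; the rank is the same either way
            return M.rank() == len(vectors)

        # Post #1: four 5-component vectors -> rank 4 of 4 -> True (independent).
        print(linearly_independent([[1, -1, 0, 2, 3], [1, 0, -1, 3, 3],
                                    [1, -1, 0, 3, 0], [0, 1, -1, 2, -2]]))

        # Post #3: three 3-component vectors -> rank 2 of 3 -> False (dependent).
        print(linearly_independent([[1, 2, 3], [4, 5, 6], [2, 1, 0]]))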
     
  8. Mar 14, 2012 #7
    Yes, I see the difference... indeed, the example the book gives is correct. Thanks for pointing that out.

    Thanks for your help, Mark44!
     