
Homework Help: Linear algebra proof

  1. Feb 4, 2010 #1
    1. The problem statement, all variables and given/known data
    Suppose that an n x n matrix A is not invertible and that n >= 2. Prove that there exist two vectors v1 and v2 in R^n such that {v1, v2} is linearly independent and {Av1, Av2} is linearly dependent.


    2. Relevant equations

    Not sure

    3. The attempt at a solution

    I've been staring at this question for days, and I just can't seem to figure it out. Any help you could provide to point me in the right direction would be much appreciated!
     
  2. Feb 4, 2010 #2

    Dick

    Science Advisor
    Homework Helper

    If A is not invertible, then there is a nonzero vector in its kernel.
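
    A quick numerical sketch of that fact, if it helps (the singular matrix here is made up; the last right singular vector of an SVD spans the kernel when the smallest singular value is zero):

        import numpy as np

        A = np.array([[1., 2., 3.],
                      [2., 4., 6.],   # row 2 = 2 * row 1, so A is not invertible
                      [0., 1., 1.]])

        _, s, Vt = np.linalg.svd(A)
        w = Vt[-1]                      # unit vector spanning the kernel
        print(s[-1])                    # ~0: the smallest singular value vanishes
        print(np.allclose(A @ w, 0))    # True: w is a nonzero vector with A w = 0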
     
  3. Feb 6, 2010 #3
    Is there a way to prove it without using the kernel? We haven't learned that yet.

    I was thinking that because A is not invertible, it is not row equivalent to the identity matrix, so when you row reduce it to echelon form, there is at least one row of all zeroes. So you can pick v1 and v2 that have all the same entries except for the last entry, with not all entries zero. Then v1 and v2 are linearly independent because they are not multiples of each other. When you multiply by the echelon form of A, you get two identical vectors because the last entries, where they differ, become 0.

    But then I have only shown that echelon(A)(v1) and echelon(A)(v2) are dependent. Is there any way to relate this to {Av1, Av2}? I probably went off on the wrong track.
     
  4. Feb 7, 2010 #4
    To rephrase Dick's suggestion: multiplication by A defines a function from R^n to R^n. The matrix A is invertible if and only if this function is invertible. It turns out that the only way this function can fail to be invertible is if it is not 1-1. This is exactly what you are trying to prove.

    What do you know about invertible matrices so far?

    You could probably get your current approach to work by modifying your vectors v1 and v2. Specifically, you need to undo all of the row operations you performed on your matrix to get it into row echelon form. Do you know that a row operation on a matrix corresponds to left multiplication by an invertible "elementary" matrix?
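
    A minimal illustration of that correspondence, with made-up matrices:

        import numpy as np

        A = np.array([[1., 2.],
                      [3., 4.]])

        # The row operation "R2 <- R2 - 3*R1" as an elementary matrix:
        E = np.array([[ 1., 0.],
                      [-3., 1.]])

        B = A.copy()
        B[1] -= 3 * B[0]                # apply the row operation directly

        print(np.allclose(E @ A, B))    # True: the row op is left multiplication by E
        print(np.linalg.inv(E))         # E is invertible; its inverse adds 3*R1 back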
     
  5. Feb 7, 2010 #5
    OK, following your suggestion:

    Let E1, E2, ..., Ep be the elementary matrices for the sequence of row operations that reduce A to echelon form. Then I would pick (Ep...E2E1)^-1(v1) and (Ep...E2E1)^-1(v2) for my linearly independent set, but I am not quite sure how to prove that the vectors I chose are linearly independent.

    So I tried the 1-1 approach:

    Let {v1, v2} be linearly independent. Then we can prove {Av1, Av2} to be linearly dependent if there exist some c1 and c2, not both zero, such that

    c1(Av1) + c2(Av2) = 0

    Since A is a matrix, the transformation defined by x |--> Ax is a linear transformation, and thus we get

    A(c1v1) + A(c2v2) = 0
    A(c1v1 + c2v2) = 0

    Since A is not invertible, the transformation is not 1-1, so A(c1v1 + c2v2) = 0 has solutions beyond the trivial one c1v1 + c2v2 = 0.

    Since there is some vector c1v1 + c2v2 =/= 0 with A(c1v1 + c2v2) = 0, c1 and c2 are not both zero. Thus {Av1, Av2} is a linearly dependent set.
     
    Last edited: Feb 7, 2010
  6. Feb 7, 2010 #6
    Did you already establish this in class? Because it does require a proof! (A general map can also fail to be invertible if it is not onto. For linear maps from R^n to R^n, being 1-1 and onto happen to be equivalent conditions, but that requires a proof.)

    How do you know that A fails to be 1-1 at zero? (It is true, but have you proved this already if you have not talked about kernels yet?)

    Why are v1 and v2 linearly independent? This does not work for two arbitrary vectors v1 and v2. How do you choose them?

    At this point, your row echelon approach seems more fruitful.

    Suppose for a contradiction that (Ep...E2E1)^-1(v1) and (Ep...E2E1)^-1(v2) are linearly dependent. Then there exist scalars a and b, not both zero, such that 0 = a (Ep...E2E1)^-1(v1) + b (Ep...E2E1)^-1(v2) = (Ep...E2E1)^-1 (a v1 + b v2). What can you conclude? (Hint: what do you know about (Ep...E2E1)^-1 ?)
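
    If it helps to see the hint numerically, here is a small sanity check (with an arbitrary invertible matrix) that invertible matrices send independent vectors to independent vectors:

        import numpy as np

        P = np.array([[2., 1.],
                      [1., 1.]])            # invertible: det(P) = 1
        v1 = np.array([1., 0.])
        v2 = np.array([0., 1.])

        V = np.column_stack([v1, v2])
        print(np.linalg.matrix_rank(V))      # 2: v1, v2 are independent
        print(np.linalg.matrix_rank(P @ V))  # 2: P v1, P v2 stay independent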
     
  7. Feb 8, 2010 #7
    Yes, this is part of the Invertible Matrix Theorem, which is already proved.


    Yes, there is also a theorem and proof on this. Sorry for not mentioning them.

    I'm a little confused here. I thought that it might not work for two arbitrary vectors because of the way the problem says "there exist". But I don't really understand why I can't pick two arbitrary vectors that are already linearly independent: basically any two vectors where neither is a multiple of the other, and then show that after the transformation is applied, there are some constants c1 and c2 (not both zero) such that c1(Av1) + c2(Av2) = 0. Then the two linearly independent vectors that I picked become dependent under the transformation.

    At this point, there must be some flaw in my failed proof, because according to what I said, any two linearly independent vectors would become dependent after the transformation. However, I still don't see it! D:
     
  8. Feb 8, 2010 #8
    (Ep...E2E1)^-1 is invertible (a product of elementary matrices is invertible, by a theorem), so
    (Ep...E2E1)^-1 (a v1 + b v2) = 0 has only the trivial solution (by a theorem in my book). Hence a v1 + b v2 = 0.

    I still don't see why v1 and v2 have to be linearly independent at this point, because v1 and v2 could be nonzero vectors with v1 = v2; then a = -b would still give a zero vector for a v1 + b v2.

    Hmm, I just realized that I get to pick v1 and v2. Let me try this again.
     
  9. Feb 8, 2010 #9
    Pick two vectors v1 and v2 in R^n such that they have all the same entries except for the last entry, and not all entries are zero. Then v1 and v2 are linearly independent because they are not multiples of each other.

    Let E1, E2, ..., Ep be the elementary matrices for the row operations that reduce A to echelon form; then (Ep...E2E1)^-1 is the matrix that transforms the echelon form of A back into A.

    My original argument used the assumption that my two vectors v1 and v2 differ in their last entry, so that the echelon form of A would send those last entries to 0 and make the images of v1 and v2 identical. However, when I apply (Ep...E2E1)^-1 to v1 and v2, the entries where they differ may no longer be the last entries. So it doesn't help with what I was trying to do anymore.

    I feel like this proof should be really simple, but I'm missing some obvious theorem or something. :(
     
  10. Feb 8, 2010 #10
    Remember, you are in the middle of a proof by contradiction. You picked v1 and v2 to be linearly independent! So if av1 + bv2 = 0, it follows that a = b = 0. We just showed that a (Ep...E2E1)^-1(v1) + b (Ep...E2E1)^-1(v2) = 0 implies a = b = 0. That shows that (Ep...E2E1)^-1(v1) and (Ep...E2E1)^-1(v2) are two independent vectors.

    I am starting to think that we should probably have used column operations instead of row operations. Think about what we want to happen: let A denote the matrix and M the reduced (row or column, TBD) echelon form. We have two linearly independent vectors v1 and v2 with M*v1 and M*v2 dependent.

    How do we go from there to independent vectors w1 and w2 with A*w1 and A*w2 dependent? It would be really easy if A = M Ep...E1, w1 = (Ep...E1)^-1 v1 and w2 = (Ep...E1)^-1 v2. Then A*w1 = M (Ep...E1)*(Ep...E1)^-1 v1 = M v1, which is exactly what we wanted!

    This should complete the proof if you work with column echelon forms rather than row echelon forms.
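
    A sketch of how the column echelon version plays out numerically (assuming sympy is available; the column echelon form is taken here as the transpose of the row echelon form of A^T, and the matrix is made up):

        import sympy as sp

        A = sp.Matrix([[1, 2, 3],
                       [2, 4, 6],
                       [0, 1, 1]])      # a singular 3x3 example

        # For this singular A the column echelon form has an all-zero last column.
        M = A.T.rref()[0].T
        print(M)

        v1 = sp.Matrix([1, 0, 0])
        v2 = sp.Matrix([1, 0, 1])       # independent: differ only in the last entry
        print(M * v1 == M * v2)         # True: equal images, hence dependent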
     
  11. Feb 8, 2010 #11
    Here is an example that shows that you have to be careful to pick the right independent vectors. Take a linear map f: R^3 -> R^3 that sends
    (1,0,0) -> (2,0,0)
    (0,1,0) -> (0,2,0)
    (0,0,1) -> (0,2,0) <- not a typo

    (1,0,0) and (0,1,0) are two linearly independent vectors and they are also sent to two linearly independent vectors. (0,1,0) and (0,0,1), on the other hand, are two linearly independent vectors which are sent to dependent vectors (in fact, to the same vector!).
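
    Written as a matrix (columns are the images of the standard basis vectors), you can check this directly:

        import numpy as np

        F = np.array([[2., 0., 0.],
                      [0., 2., 2.],
                      [0., 0., 0.]])   # columns = images of e1, e2, e3

        e1, e2, e3 = np.eye(3)
        print(F @ e1, F @ e2)   # [2 0 0] [0 2 0]: independent images
        print(F @ e2, F @ e3)   # [0 2 0] [0 2 0]: the same vector, hence dependent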
     
  12. Feb 8, 2010 #12
    I guess column echelon form would work here, but I don't know if I am allowed to use it, because we have not defined it in class and the book doesn't mention it.
     
  13. Feb 8, 2010 #13
    If you already know that a map that is not invertible has nontrivial kernel, you are almost done. Suppose f(w) = f(0) = 0. Take any vector v that is linearly independent from w. Then f(v+w) = f(v)+f(w) = f(v). So f(v+w) and f(v) are clearly linearly dependent. Can you show that (v+w) and v are linearly independent?
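
    Numerically, the whole construction looks something like this (a made-up singular matrix; the kernel vector is read off from the SVD, and a random v is independent from w almost surely):

        import numpy as np

        A = np.array([[1., 2., 3.],
                      [2., 4., 6.],
                      [0., 1., 1.]])    # not invertible

        _, _, Vt = np.linalg.svd(A)
        w = Vt[-1]                      # nonzero kernel vector: A w = 0

        rng = np.random.default_rng(0)
        v = rng.standard_normal(3)      # a generic v, independent from w

        print(np.linalg.matrix_rank(np.column_stack([v, v + w])))  # 2: independent
        print(np.allclose(A @ (v + w), A @ v))                     # True: f(v+w) = f(v)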
     
  14. Feb 8, 2010 #14
    I guess what I learned is the same thing as the kernel, just defined differently (without using the term "kernel").

    Theorem: Let T: R^n --> R^m be a linear transformation. Then T is one-to-one if and only if the equation T(x)=0 has only the trivial solution.

    Invertible Matrix Theorem (excerpted): the following statements are equivalent:
    a) A is an invertible matrix.
    f) The linear transformation x |-->Ax is one-to-one.

    So A is not invertible, and hence the linear transformation T defined by x |--> Ax is not 1-1.
    Then T(x) = 0 has a nontrivial solution.

    If I can prove that (v+w) and v are linearly independent, then the proof is done. Here w is fixed, because f(w) = f(0) = 0 but w is nonzero. So why do I have to prove that (v+w) and v are linearly independent, instead of choosing a vector v that is identical to w but differs in at least one entry? Then aren't they linearly independent?
     
  15. Feb 8, 2010 #15
    Whatever works for you is fine - it does not matter how you establish that (v+w) and v are linearly independent, as long as they are.
     
  16. Feb 8, 2010 #16
    I guess the proof was simple after all. THANK YOU for your help :D
     