
Linear Algebra - Proofs involving Inverses

  1. Jan 21, 2009 #1
    Two fairly simple proof problems... why aren't they simpler? :(
    1. The problem statement, all variables and given/known data
    Let A be an nxn matrix...
    If A is not invertible then there exists an nxn matrix B such that AB = 0, B != 0. (not equal to)


    2. Relevant equations
    None really.


    3. The attempt at a solution
    Obviously, when A is the zero matrix, AB = 0.

    If we call A the coefficient matrix in the system of equations Ax = 0, then x = x1B1 + x2B2 + ... + xnBn, where B = [B1|B2|...|Bn].

    I can't seem to explain why that works. Is it obvious enough just to say that or is there a step of explanation I have left out?

    --------------------------------------------------------------------------------

    1. The problem statement, all variables and given/known data
    If A is an m x n matrix, B is an n x m matrix and n < m, then AB is not invertible.


    2. Relevant equations



    3. The attempt at a solution
    Obviously, A and B are not square and are therefore not invertible. Does that fact really matter? The product of invertible matrices is invertible, but is the product of non-invertible matrices also non-invertible?
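    As a sanity check (not a proof), here is a quick numerical experiment with numpy; the sizes and the random matrices are just an example I picked:

[code]
# Sanity check (not a proof): with n < m, the m x m product AB has
# rank at most n, so it cannot be invertible.
import numpy as np

m, n = 5, 3                       # example sizes with n < m
rng = np.random.default_rng(0)
A = rng.standard_normal((m, n))   # A is m x n
B = rng.standard_normal((n, m))   # B is n x m

AB = A @ B                        # AB is m x m
print(np.linalg.matrix_rank(AB))  # at most n = 3, strictly less than m = 5
print(np.linalg.det(AB))          # essentially zero, so AB is singular
[/code]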
     
  3. Jan 21, 2009 #2
    Well, what I would say is this: an invertible matrix is nonsingular; in fact, a matrix is invertible if and only if it is nonsingular. Since A here is assumed to be non-invertible, A is singular. What this means is that there exists some non-zero vector, call it b, such that Ab = 0.
    Extrapolating from this, we can pick non-zero vectors

    [tex]B_1, B_2, \dots, B_n[/tex] such that [tex]A B_i = 0[/tex] for all i = 1, 2, \dots, n (for instance, take every [tex]B_i[/tex] equal to b).

    So, if we build the matrix [tex]B = [B_1, B_2, \dots, B_n][/tex],

    we have actually proven that AB = 0, whereas, as we can clearly see, B is not the zero matrix.
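    Here is a minimal numerical sketch of that construction (numpy is just for illustration; the particular A and b below are made-up examples):

[code]
# Minimal sketch of the construction: A is singular, b is a nonzero
# vector with A b = 0, and every column of B is a copy of b.
import numpy as np

A = np.array([[1., 2.],
              [2., 4.]])        # singular: second row is twice the first

b = np.array([2., -1.])         # nonzero, and A @ b = 0
B = np.column_stack([b, b])     # B = [b | b], not the zero matrix

print(A @ B)                    # the 2 x 2 zero matrix
print(np.allclose(B, 0))        # False: B != 0
[/code]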
     
  4. Jan 21, 2009 #3
    How can I guarantee that B is not zero?
     
  5. Jan 21, 2009 #4

    Dick

    Science Advisor
    Homework Helper

    You know there is a non-zero vector b such that Ab = 0, as sutupidmath said. You also seem to know that the column space of a matrix is its range. So build a matrix whose column space consists only of multiples of b.
     
  6. Jan 21, 2009 #5
    Thanks everyone. Think I have it now.
     