Show a Matrix with a Row of 0's Cannot Have an Inverse

  1. Oct 12, 2009 #1
    1. The problem statement, all variables and given/known data

    Show that a matrix with a row of zeros cannot have an inverse.

    3. The attempt at a solution

    I believe that I have to use the definition of invertible to do this.

    If A is an n x n matrix, then A is invertible if there exists some matrix B such that AB = BA = I,
    where I is the identity.

    I would just like a hint. Is there some other definition that I should be using as well? I feel like the definition of matrix multiplication comes into play here too.

    Thanks.
     
  3. Oct 12, 2009 #2

    Office_Shredder

    If the kth row of the matrix is zeroes, what can you say about the kth entry of any image vector?
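
    That is, if the kth row of A is all zeros, then for any vector x the kth entry of Ax is

    [tex]
    (Ax)_k = \sum_{j=1}^{n} a_{kj}\,x_j = \sum_{j=1}^{n} (0)\,x_j = 0.
    [/tex]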
     
  4. Oct 12, 2009 #3
    A square matrix is invertible if and only if its determinant is non-zero. If there is a row of zeros, what happens to the determinant of the matrix?
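
    (For reference, a cofactor expansion along the zero row, say row k, gives

    [tex]
    \det A = \sum_{j=1}^{n} (-1)^{k+j} a_{kj}\, M_{kj} = \sum_{j=1}^{n} (-1)^{k+j} (0)\, M_{kj} = 0,
    [/tex]

    where M_{kj} is the corresponding minor.)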
     
  5. Oct 12, 2009 #4
    Maybe I will start with the 2 x 2 case.

    So I have:
    [tex]
    A =
    \left[\begin{array} {cc}
    a_{11}&a_{12}\\
    0&0
    \end{array}\right]
    [/tex]

    Now if I start by assuming that A is invertible, then there exists some matrix B such that AB = BA = I, or

    [tex]
    \left[\begin{array} {cc}
    a_{11}&a_{12}\\
    0&0
    \end{array}\right]


    \left[\begin{array} {cc}
    b_{11}&b_{12}\\
    b_{21}&b_{22}
    \end{array}\right]
    =
    \left[\begin{array} {cc}
    1&0\\
    0&1
    \end{array}\right]
    [/tex]

    [tex]\Rightarrow \left[\begin{array} {cc}
    a_{11}b_{11}+a_{12}b_{21}&a_{11}b_{12}+a_{12}b_{22}\\
    (0)(b_{11})+(0)(b_{21})&(0)b_{12}+(0)b_{22}
    \end{array}\right]
    =
    \left[\begin{array} {cc}
    1&0\\
    0&1
    \end{array}\right]

    [/tex]

    Setting corresponding entries equal to each other, the bottom row of the product is all zeros while the bottom row of the identity is not, which is a contradiction; so my assumption that A was invertible must be false.
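
    For instance, the (2,2) entry of the product is

    [tex]
    (0)b_{12} + (0)b_{22} = 0 \neq 1,
    [/tex]

    no matter what B is.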

    Does that work?

    EDIT: Thanks for the responses, guys, but I have no idea what an image vector is, and the determinant has not been covered yet (well, I do know what it is, but since the text has not introduced it, I know I cannot use that definition). All I have to go on is the definition of invertible.

    But I think I got it.
     
  6. Oct 12, 2009 #5

    Office_Shredder

    Ok, same concept. If A is your matrix, then given any matrix B, the entries of the kth row of AB come from multiplying the kth row of A against the columns of B. What can you say about the kth row of B in this case?
     
  7. Oct 12, 2009 #6
    I am not sure what you are saying. I don't see what can be said about the kth row of B.

    What I do see is that since the kth row of AB is a linear combination of ALL of the rows of B, with coefficients coming from the kth row of A, the kth row of AB must be all 0's, and thus AB can never be the identity matrix.
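
    In symbols, if the kth row of A is all zeros, then for every j

    [tex]
    (AB)_{kj} = \sum_{i=1}^{n} a_{ki}\, b_{ij} = \sum_{i=1}^{n} (0)\, b_{ij} = 0,
    [/tex]

    while the identity matrix has a 1 in position (k,k).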

    Is that what you are trying to get out of me?

    Thanks!
     
  8. Oct 12, 2009 #7

    Office_Shredder

    Whoops, I meant the kth row of AB. Good job spotting the right thing to do anyway.

    But note that the kth row of AB is a linear combination of the columns of B, not the rows.
     
  9. Oct 13, 2009 #8
    Err...I think you are mistaken. The rows of AB are linear combos of the rows of B.

    The columns of AB are linear combos of the columns of A.
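
    Spelled out, the kth row of the product is

    [tex]
    \text{row}_k(AB) = \sum_{i=1}^{n} a_{ki}\,\text{row}_i(B),
    [/tex]

    i.e. a linear combination of the rows of B with coefficients taken from the kth row of A.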

    But who's counting anyway :wink:
     