Show a matrix with a row of zeros cannot have an inverse

1. Oct 12, 2009

1. The problem statement, all variables and given/known data

Show that a matrix with a row of zeros cannot have an inverse.

3. The attempt at a solution

I believe that I have to use the definition of invertible to do this.

An n×n matrix A is invertible if there exists some matrix B such that AB = BA = I,
where I is the identity matrix.

I would just like a hint here. Is there some other definition that I should be using? I feel like I also need the definition of matrix multiplication.

Thanks.
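As a quick numerical illustration of that definition (an assumed example, not part of the proof): for an invertible 2×2 matrix A, its inverse B satisfies both AB = I and BA = I.

```python
# Illustration of the definition of invertibility for a 2x2 example.
# A and B below are assumed values chosen by hand; B is the inverse of A.

def matmul(X, Y):
    """Multiply two 2x2 matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[2, 1],
     [1, 1]]
B = [[1, -1],
     [-1, 2]]   # inverse of A, found by hand

I = [[1, 0],
     [0, 1]]

print(matmul(A, B) == I and matmul(B, A) == I)  # True
```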

2. Oct 12, 2009

Office_Shredder

Staff Emeritus
If the kth row of the matrix is zeroes, what can you say about the kth entry of any image vector?

3. Oct 12, 2009

n!kofeyn

A square matrix is invertible if and only if its determinant is non-zero. If there is a row of zeros, what happens to the determinant of the matrix?
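The determinant hint can be checked concretely (a side illustration using a 3×3 example the thread itself doesn't use): expanding det(A) along the row of zeros, every cofactor gets multiplied by 0, so det(A) = 0.

```python
# Check that a 3x3 matrix with a row of zeros has determinant 0.
# The matrix A here is an arbitrary assumed example.

def det3(M):
    """3x3 determinant by cofactor expansion along the first row."""
    a, b, c = M[0]
    d, e, f = M[1]
    g, h, i = M[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

A = [[4, 7, 2],
     [0, 0, 0],   # the row of zeros
     [1, 3, 5]]

print(det3(A))  # 0
```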

4. Oct 12, 2009

Maybe I will start with the 2 x 2 case.

So I have:
$$A = \left[\begin{array} {cc} a_{11}&a_{12}\\ 0&0 \end{array}\right]$$

Now if I start by assuming that A is invertible, then there exists some matrix B such that AB = BA = I, or

$$\left[\begin{array} {cc} a_{11}&a_{12}\\ 0&0 \end{array}\right] \left[\begin{array} {cc} b_{11}&b_{12}\\ b_{21}&b_{22} \end{array}\right] = \left[\begin{array} {cc} 1&0\\ 0&1 \end{array}\right]$$

$$\Rightarrow \left[\begin{array} {cc} a_{11}b_{11}+a_{12}b_{21}&a_{11}b_{12}+a_{12}b_{22}\\ (0)(b_{11})+(0)(b_{21})&(0)b_{12}+(0)b_{22} \end{array}\right] = \left[\begin{array} {cc} 1&0\\ 0&1 \end{array}\right]$$

Setting corresponding entries equal to each other, the bottom-right entry gives 0 = 1, which is false; thus my assumption that A was invertible must also be false.

Does that work?

EDIT: Thanks for the responses, guys, but I have no idea what an image vector is, and the text has not covered determinants yet (I do know what they are, but I know I cannot use them here). All I have to go on is the definition of invertible.

But I think I got it.
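The 2×2 computation above can also be checked numerically (a sketch with arbitrary assumed entries for A): whatever B is, the bottom row of AB stays zero, so AB can never equal the identity.

```python
import random

# Sketch of the 2x2 case: if the bottom row of A is all zeros,
# then the bottom row of AB is all zeros for every B.

def matmul(X, Y):
    """Multiply two 2x2 matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[3, 5],   # arbitrary a11, a12
     [0, 0]]   # the row of zeros

for _ in range(1000):
    B = [[random.uniform(-10, 10) for _ in range(2)] for _ in range(2)]
    assert matmul(A, B)[1] == [0, 0]   # bottom row of AB is always zero

print("no B makes the bottom row of AB nonzero")
```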

5. Oct 12, 2009

Office_Shredder

Staff Emeritus
Ok, same concept. If A is your matrix, then given any matrix B, the entries of the kth row of AB come from multiplying the kth row of A against the columns of B. What can you say about the kth row of B in this case?

6. Oct 12, 2009

I am not sure what you are saying. I don't see what can be said about the kth row of B?

What I do see is that since the kth row of AB is a linear combination of ALL of the rows of B with coefficients coming from the kth row of A that the kth row of AB must be all 0's and thus can never be the identity matrix.

Is that what you are trying to get out of me?

Thanks!
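The general argument above can be sketched numerically (n and k below are arbitrary assumed choices): if the kth row of A is all zeros, the kth row of AB is all zeros for any B, so AB can never be the identity.

```python
import random

# Sketch of the general n x n case: a zero kth row in A forces
# a zero kth row in AB, since that row is a linear combination
# of the rows of B with coefficients from the kth row of A.

def matmul(X, Y):
    """Multiply two n x n matrices given as lists of rows."""
    n = len(X)
    return [[sum(X[i][m] * Y[m][j] for m in range(n)) for j in range(n)]
            for i in range(n)]

n, k = 4, 2   # arbitrary size and row index
for _ in range(100):
    A = [[random.uniform(-5, 5) for _ in range(n)] for _ in range(n)]
    A[k] = [0] * n                     # force a row of zeros
    B = [[random.uniform(-5, 5) for _ in range(n)] for _ in range(n)]
    assert all(entry == 0 for entry in matmul(A, B)[k])

print("kth row of AB is always zero")
```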

7. Oct 12, 2009

Office_Shredder

Staff Emeritus
Whoops, I meant the kth row of AB. Good job spotting the right thing to do anyway.

And yes, the kth row of AB is a linear combination of the rows of B, with the coefficients coming from the kth row of A.

8. Oct 13, 2009