Show a Matrix w/a row of 0's cannot have Inverse

Saladsamurai

Homework Statement



Show that a matrix with a row of zeros cannot have an inverse.

The Attempt at a Solution



I believe that I have to use the definition of invertible to do this.

An n x n matrix A is invertible if there exists some matrix B such that AB = BA = I, where I is the identity.

I would just like a hint here. Is there some other definition that I should be using here as well? I feel like I should be using the definition of matrix multiplication as well.
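For reference, the entrywise definition of matrix multiplication I am thinking of (with A and B both n x n) is

(AB)_{ij} = \sum_{k=1}^{n} a_{ik}\,b_{kj}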

Thanks.
 
If the kth row of the matrix is all zeros, what can you say about the kth entry of any image vector?
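In other words (using the standard matrix-vector product): if row k of A is all zeros, then for any vector x,

(Ax)_k = \sum_{j=1}^{n} a_{kj}\,x_j = \sum_{j=1}^{n} 0 \cdot x_j = 0,

so the kth entry of every image vector is 0.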
 
A square matrix is invertible if and only if its determinant is non-zero. If there is a row of zeros, what happens to the determinant of the matrix?
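For instance, a cofactor (Laplace) expansion along the zero row k gives

\det A = \sum_{j=1}^{n} a_{kj}\,C_{kj} = \sum_{j=1}^{n} 0 \cdot C_{kj} = 0,

where the C_{kj} are the cofactors, and a matrix with zero determinant is not invertible.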
 
Maybe I will start with the 2 x 2 case.

So I have:
A = \left[\begin{array}{cc} a_{11} & a_{12} \\ 0 & 0 \end{array}\right]

Now if I start by assuming that 'A' is invertible, then that implies that there exists some matrix 'B' such that AB = BA = I, or

\left[\begin{array}{cc} a_{11} & a_{12} \\ 0 & 0 \end{array}\right] \left[\begin{array}{cc} b_{11} & b_{12} \\ b_{21} & b_{22} \end{array}\right] = \left[\begin{array}{cc} 1 & 0 \\ 0 & 1 \end{array}\right]

\Rightarrow \left[\begin{array}{cc} a_{11}b_{11} + a_{12}b_{21} & a_{11}b_{12} + a_{12}b_{22} \\ (0)b_{11} + (0)b_{21} & (0)b_{12} + (0)b_{22} \end{array}\right] = \left[\begin{array}{cc} 1 & 0 \\ 0 & 1 \end{array}\right]

Setting corresponding entries equal to each other, the bottom-right entry would require (0)b_{12} + (0)b_{22} = 1, i.e. 0 = 1, which is impossible. So my assumption that A was invertible must be false.

Does that work?

EDIT: Thanks for the responses, guys, but I have no idea what an image vector is, nor can I use the determinant (well, I do know what it is, but the text has not covered it yet, so I know I cannot use those definitions). All I have to go on is the definition of invertible.

But I think I got it.
 
Ok, same concept. If A is your matrix, given any matrix B, for the kth row of AB, you have entries that are based on how the kth row of A multiplies with columns of B. What can you say about the kth row of B in this case?
 
Office_Shredder said:
Ok, same concept. If A is your matrix, given any matrix B, for the kth row of AB, you have entries that are based on how the kth row of A multiplies with columns of B. What can you say about the kth row of B in this case?

I am not sure what you are saying. I don't see what can be said about the kth row of B.

What I do see is that the kth row of AB is a linear combination of ALL of the rows of B, with coefficients coming from the kth row of A. Since those coefficients are all zero, the kth row of AB must be all 0's, and so AB can never be the identity matrix.
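Written out with indices, since every entry of the kth row of A is zero, that is

(AB)_{kj} = \sum_{i=1}^{n} a_{ki}\,b_{ij} = \sum_{i=1}^{n} 0 \cdot b_{ij} = 0 \quad \text{for every } j,

so row k of AB is entirely zero, while row k of the identity has a 1 in position k. Hence AB \neq I for any choice of B.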

Is that what you are trying to get out of me?

Thanks!
 
Whoops, I meant the kth row of AB. Good job spotting the right thing to do anyway

But note that the kth row of AB is linear combinations of the columns of B, not the rows.
 
Office_Shredder said:
Whoops, I meant the kth row of AB. Good job spotting the right thing to do anyway

But note that the kth row of AB is linear combinations of the columns of B, not the rows.

Err...I think you are mistaken. The rows of AB are linear combos of the rows of B.

The columns of AB are linear combos of the columns of A.
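A quick index check of both statements (same notation as above):

\text{row}_k(AB) = \sum_{i=1}^{n} a_{ki}\,\text{row}_i(B), \qquad \text{col}_j(AB) = \sum_{i=1}^{n} b_{ij}\,\text{col}_i(A)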

But who's counting anyway :wink:
 