Show a Matrix with a Row of 0's Cannot Have an Inverse

In summary, a matrix with a row of zeros cannot have an inverse, and its determinant is always zero.
  • #1
Saladsamurai

Homework Statement



Show that a matrix with a row of zeros cannot have an inverse.

The Attempt at a Solution



I believe that I have to use the definition of invertible to do this.

If A is an n x n matrix, then A is invertible if there exists some matrix B such that AB = BA = I,
where I is the identity.

I would just like a hint here. Is there some other definition that I should be using here as well? I feel like I should be using the definition of matrix multiplication as well.

Thanks.
 
  • #2
If the kth row of the matrix is zeroes, what can you say about the kth entry of any image vector?
 
  • #3
A square matrix is invertible if and only if its determinant is non-zero. If there is a row of zeros, what happens to the determinant of the matrix?
 
  • #4
Maybe I will start with the 2 x 2 case.

So I have:
[tex]
A =
\left[\begin{array} {cc}
a_{11}&a_{12}\\
0&0
\end{array}\right]
[/tex]

Now if I start by assuming that 'A' is invertible, then there exists some matrix 'B' such that AB = BA = I, or

[tex]
\left[\begin{array} {cc}
a_{11}&a_{12}\\
0&0
\end{array}\right]\left[\begin{array} {cc}
b_{11}&b_{12}\\
b_{21}&b_{22}
\end{array}\right]
=
\left[\begin{array} {cc}
1&0\\
0&1
\end{array}\right]
[/tex]

[tex]\Rightarrow \left[\begin{array} {cc}
a_{11}b_{11}+a_{12}b_{21}&a_{11}b_{12}+a_{12}b_{22}\\
(0)(b_{11})+(0)(b_{21})&(0)b_{12}+(0)b_{22}
\end{array}\right]
=
\left[\begin{array} {cc}
1&0\\
0&1
\end{array}\right]

[/tex]

Setting corresponding entries equal to each other, the bottom-right entry gives 0 = 1, which is clearly false; thus my assumption that A was invertible must also be false.

Does that work?

EDIT: Thanks for the responses guys, but I have no idea what an Image vector is nor the determinant (well, I do know, but the text has not covered that yet so I know that I cannot use those definitions yet). All I have to go on is the definition of invertible.

But I think I got it.
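The 2 x 2 contradiction worked out above can also be checked numerically. A minimal NumPy sketch (the nonzero entries 3 and 5 are arbitrary sample values, not from the problem):

```python
import numpy as np

# A 2x2 matrix whose second row is all zeros (sample values in the first row).
A = np.array([[3.0, 5.0],
              [0.0, 0.0]])

# For ANY choice of B, the second row of A @ B is a zero row,
# so A @ B can never equal the identity matrix.
B = np.array([[1.0, 2.0],
              [3.0, 4.0]])
print((A @ B)[1])  # [0. 0.]

# Consequently NumPy refuses to invert A.
try:
    np.linalg.inv(A)
except np.linalg.LinAlgError as e:
    print("not invertible:", e)
```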
 
  • #5
Ok, same concept. If A is your matrix, given any matrix B, for the kth row of AB, you have entries that are based on how the kth row of A multiplies with columns of B. What can you say about the kth row of B in this case?
 
  • #6
Office_Shredder said:
Ok, same concept. If A is your matrix, given any matrix B, for the kth row of AB, you have entries that are based on how the kth row of A multiplies with columns of B. What can you say about the kth row of B in this case?

I am not sure what you are saying. I don't see what can be said about the kth row of B.

What I do see is that since the kth row of AB is a linear combination of ALL of the rows of B, with coefficients coming from the kth row of A, the kth row of AB must be all 0's, and thus AB can never be the identity matrix.

Is that what you are trying to get out of me?

Thanks!
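The row-of-AB argument above holds for any size and any B; a quick NumPy sketch with an arbitrary random 3 x 3 example (zero row in position k = 1):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
A[1] = 0.0                      # zero out the kth row (k = 1)
B = rng.standard_normal((3, 3))

# Row k of A @ B is a linear combination of the rows of B with
# coefficients taken from row k of A -- all zero, so the row is zero
# and A @ B cannot be the identity.
assert np.allclose((A @ B)[1], 0.0)
```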
 
  • #7
Whoops, I meant the kth row of AB. Good job spotting the right thing to do anyway.

But note that the kth row of AB is linear combinations of the columns of B, not the rows.
 
  • #8
Office_Shredder said:
Whoops, I meant the kth row of AB. Good job spotting the right thing to do anyway

But note that the kth row of AB is linear combinations of the columns of B, not the rows.

Err...I think you are mistaken. The rows of AB are linear combos of the rows of B.

The columns of AB are linear combos of the columns of A.

But who's counting anyway :wink:
 

What is a matrix with a row of 0's?

A matrix is a rectangular array of numbers or symbols arranged in rows and columns. A row of 0's in a matrix means that all the elements in that particular row are equal to 0.

Why can't a matrix with a row of 0's have an inverse?

In order to have an inverse, a matrix must be square and its determinant must be nonzero. A row of zeros makes the rows linearly dependent, so the determinant is zero and the matrix cannot have an inverse. Equivalently, as shown in the thread, every product AB then has a row of zeros and can never equal the identity.
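As a quick numerical check, the determinant of a square matrix with a row of zeros vanishes (the sample entries below are arbitrary):

```python
import numpy as np

A = np.array([[2.0, 7.0, 1.0],
              [0.0, 0.0, 0.0],   # row of zeros
              [4.0, 3.0, 9.0]])

# The determinant is zero (up to floating-point sign/rounding),
# so A is singular and has no inverse.
print(np.linalg.det(A))
```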

What is an inverse of a matrix?

The inverse of a matrix A is another matrix denoted as A^-1, which when multiplied by A gives the identity matrix (a square matrix with 1's on the main diagonal and 0's elsewhere).

Can a matrix with a row of 0's have a pseudo-inverse?

Yes, a matrix with a row of 0's can have a pseudo-inverse. The Moore-Penrose pseudo-inverse, denoted A^+, is a generalization of the inverse that exists for every matrix, including singular square matrices and non-square matrices, and it is unique.
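A short NumPy sketch of the pseudo-inverse for the singular 2 x 2 matrix used earlier in the thread; the defining property A A^+ A = A holds even though A has no true inverse:

```python
import numpy as np

A = np.array([[3.0, 5.0],
              [0.0, 0.0]])        # singular: row of zeros

A_pinv = np.linalg.pinv(A)        # Moore-Penrose pseudo-inverse

# A @ A_pinv @ A recovers A, but A_pinv is not a true inverse:
# A @ A_pinv is not the identity.
print(np.allclose(A @ A_pinv @ A, A))   # True
print(np.allclose(A @ A_pinv, np.eye(2)))  # False
```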

What is the significance of a matrix with a row of 0's?

A matrix with a row of 0's can represent an inconsistent system of linear equations (one with no solution), or a dependent system, where one equation is a linear combination of the others. In either case, a row of 0's indicates that the matrix is singular, i.e., it has no inverse.
