If AB is invertible, then A and B are invertible.

  • Context: Graduate
  • Thread starter: Bipolarity
SUMMARY

If the product AB of two square matrices is invertible, then both A and B must be invertible. The thread establishes this without determinants: the rank-nullity theorem shows that a one-sided inverse of a square matrix is automatically two-sided, and an argument with elementary matrices rules out every case in which A or B is singular, since a singular factor forces AB to be singular. A short determinant-based proof is also noted.

PREREQUISITES
  • Understanding of square matrices and their properties
  • Familiarity with the rank-nullity theorem
  • Knowledge of elementary matrices and their role in matrix operations
  • Basic concepts of linear transformations and their representations
NEXT STEPS
  • Study the rank-nullity theorem in detail
  • Learn about the properties of elementary matrices and their applications
  • Explore proofs involving determinants and their implications for matrix invertibility
  • Investigate linear transformations and their relationship with matrix operations
USEFUL FOR

Mathematicians, linear algebra students, and educators seeking to deepen their understanding of matrix theory and invertibility conditions.

Bipolarity
If AB is invertible, then A and B are invertible for square matrices A and B.

I am curious about the proof of the above. I know there is a very straightforward proof that involves determinants, but I am interested in seeing if there is a proof that doesn't use determinants.

In an attempt to prove this, I considered the contrapositive:
If at least one of {A,B} is singular, then AB is singular.

I successfully proved that if B is singular (or if both A and B are singular), then AB is necessarily singular. To do this, I showed that Bx = 0 having nontrivial solutions implies that ABx = 0 has nontrivial solutions.

Unfortunately, I was not able to apply the above step to the case where only A is singular. If A is singular, Ax = 0 has nontrivial solutions. But how can I show that ABx = 0 has nontrivial solutions?

BiP
 
It is a famous result for matrices (and only for matrices) that if a matrix C has a left inverse, then it has an inverse. Thus if there exists a matrix D such that DC=I, then CD=I as well. The same holds of course for C having a right inverse.

Do you know this result? It follows essentially from the rank-nullity theorem. You can use that to prove your result.
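
A sketch of that result, assuming ##C## and ##D## are ##n \times n## matrices over a field: if ##DC = I##, then ##Cx = 0## forces ##x = (DC)x = D(Cx) = 0##, so the kernel of ##C## is trivial. By rank-nullity, ##\operatorname{rank}(C) = n##, so ##C## is onto, hence bijective, and ##C^{-1}## exists. Then
$$D = D(CC^{-1}) = (DC)C^{-1} = C^{-1},$$
so the left inverse is automatically the two-sided inverse.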

Another way to prove it. There is a result that if A is singular, then there exists a vector c such that Ax=c has no solution. Can you prove this result?? (again, it basically is rank-nullity) Can you use this to prove your theorem?
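
To sketch how that second hint finishes the proof (again with everything ##n \times n##): if ##A## is singular, choose ##c## so that ##Ax = c## has no solution. Then ##(AB)y = c## has no solution either, since any solution ##y## would make ##x = By## a solution of ##Ax = c##:
$$(AB)y = A(By) = c.$$
So ##AB## is not onto and cannot be invertible.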
 
Bipolarity said:
I successfully proved that if B is singular (or if both A and B are singular), then AB is necessarily singular. To do this, I showed that Bx = 0 having nontrivial solutions implies that ABx = 0 has nontrivial solutions.

Unfortunately, I was not able to apply the above step to the case where only A is singular. If A is singular, Ax = 0 has nontrivial solutions. But how can I show that ABx = 0 has nontrivial solutions?
Use a row vector rather than a column vector. If A is singular then ##xA = 0## has nontrivial solutions.
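
Spelling that hint out (a sketch, with ##x## a nonzero row vector): since ##\operatorname{rank}(A^T) = \operatorname{rank}(A) < n##, the system ##xA = 0## has a nontrivial solution. Then
$$x(AB) = (xA)B = 0,$$
so the rows of ##AB## are linearly dependent and ##AB## is singular.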
 
If ##AB## has an inverse ##(AB)^{-1}##, this means that ##AB(AB)^{-1}=I##. Can you use this to find a matrix ##C## such that ##AC=I##?
If you find such a ##C##, then also ##CA=I##, as micromass pointed out.
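
One candidate, just unwinding the definitions: take ##C = B(AB)^{-1}##. By associativity,
$$AC = A\left(B(AB)^{-1}\right) = (AB)(AB)^{-1} = I,$$
so ##C## is a right inverse of ##A##, and the one-sided-inverse result gives ##A^{-1} = C##. The same idea with ##(AB)^{-1}A##, a left inverse of ##B##, handles ##B##.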
 
If you think of a square matrix as a linear mapping, then it is invertible if and only if it is one-to-one and onto.

This means that it can send only the zero vector to zero, and no other vector.
If A or B were not invertible, then there would be a nonzero vector v with either Bv = 0, in which case (AB)v = 0 and AB is not invertible; or, if B is invertible but A is not, a nonzero v with Av = 0, in which case ##(AB)(B^{-1}v) = Av = 0## and AB is not invertible.
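
In display form, the two cases (with ##v \neq 0##):
$$Bv = 0 \;\Rightarrow\; (AB)v = A(Bv) = 0, \qquad Av = 0 \;\Rightarrow\; (AB)(B^{-1}v) = A(BB^{-1}v) = Av = 0.$$
In the second case ##B^{-1}v \neq 0##, so in both cases ##AB## sends a nonzero vector to zero.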
 
do you know that matrices represent linear maps between linear spaces? if so do you know the basic theory of dimension of linear spaces?

there is a fundamental formula that dim(image) + dim(kernel) = dim(source).

it follows from this that if either map has a nontrivial kernel, the image of the composition cannot be the whole target space.
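
filling in that step (a sketch for ##n \times n## matrices): if ##\ker B \neq \{0\}##, then ##\ker(AB) \supseteq \ker B## is nontrivial, so the formula gives ##\dim\operatorname{im}(AB) < n##. if instead ##\ker A \neq \{0\}##, then ##\dim\operatorname{im}(A) < n##, and since ##\operatorname{im}(AB) \subseteq \operatorname{im}(A)##,
$$\dim\operatorname{im}(AB) \le \dim\operatorname{im}(A) < n.$$
either way the composition is not onto, hence not invertible.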
 
This illustrates what's so great about linear algebra. You could prove that matrix multiplication is associative by brute force algebra (yuck!) or you could just observe that it's obvious because composing functions is associative and matrices represent functions. Actually, matrix multiplication is defined to be the way it is, precisely so that the matrix of the composition of the functions is the product of the matrices of the functions being composed. This is another example of that sort of phenomenon, although the algebraic proof isn't too hard, as people have hinted at.
 
I would find it nice if the OP would return to this thread and give us some feedback. Did he manage to prove it? Were we clear enough with our suggestions? Is he totally confused by everything we said?
 
Consider this. If AB is defined and ##(AB)^{-1}## exists, then there are only four possibilities. 1: A and B are both invertible. 2: A is invertible and B is singular. 3: A is singular and B is invertible. 4: A and B are both singular. Go through each case, and you'll see that A and B have to be invertible if ##(AB)^{-1}## exists (you should reach a contradiction to the hypothesis that AB is invertible in cases 2-4). Once you rule out the other cases, case 1 is the only possibility left. Use elementary matrices in your argument if you don't want to use determinants.
 
  • #10
If A and B are square matrices and ##(AB)^{-1}## exists, then A is invertible and B is invertible.
Proof: If AB is defined and ##(AB)^{-1}## exists, then there are four possibilities: A and B are both invertible, A is invertible and B is singular, A is singular and B is invertible, or A and B are both singular.

Case 1: Trivial

Case 2: Let A be an invertible matrix and B be a singular matrix, let AB be defined, and let ##(AB)^{-1}## exist.
$$(AB)(AB)^{-1} = I$$
Since ##(AB)^{-1}## exists, it is itself invertible and can therefore be written as a product of elementary matrices.
$$(AB)E_1E_2\cdots E_n = I$$
$$AB = E_n^{-1}E_{n-1}^{-1}\cdots E_1^{-1}$$
$$A^{-1}AB = A^{-1}E_n^{-1}E_{n-1}^{-1}\cdots E_1^{-1}$$
$$B = A^{-1}E_n^{-1}E_{n-1}^{-1}\cdots E_1^{-1}$$
Since ##A^{-1}## exists, it too can be written as a product of elementary matrices.
∴ B is invertible, as it can be written as a product of elementary matrices, each of which is invertible.
This is a contradiction to the hypothesis that B is singular. ##\rightarrow\leftarrow##

(This also proves that case 3 can't be true.)

Case 4: Let A and B both be singular matrices, and let AB be defined.

There exists some sequence of row operations that will turn A into its reduced row echelon form R, so there exists a product of elementary matrices that achieves the same purpose.
$$E_1E_2\cdots E_rA = R$$
$$A = E_r^{-1}\cdots E_1^{-1}R$$
$$AB = E_r^{-1}\cdots E_1^{-1}RB$$
Now the product RB contains a row of all zeroes, since R is the reduced row echelon form of a singular matrix, and a zero row of R yields a zero row of RB. A matrix with a row of zeroes cannot be invertible, and if AB were invertible then ##RB = E_1E_2\cdots E_r(AB)## would be invertible as well. Hence there is no matrix C such that
$$C(AB) = (AB)C = I.$$
∴ If A and B are singular, then the product AB can't be invertible.

The only case left is case 1, which is trivially true.

∴ If AB is defined and ##(AB)^{-1}## exists, then A is invertible and B is invertible. ##\triangleleft##
 
  • #11
Thank you for writing it all out! It must have been a pain! I'm sorry to ask this but could you outline the proof for Case 3 also? It's the one I have the most trouble with. Thanks!

BiP
 
  • #12
Proving case 3 is exactly the same as proving case 2, except that we multiply on the right instead of on the left.

Case 3: Let A be a singular matrix and let B be an invertible matrix, let AB be defined, and let ##(AB)^{-1}## exist. We know that ##(AB)(AB)^{-1} = I##. As with case 2, we can write ##(AB)^{-1}## as a product of elementary matrices.

$$(AB)(AB)^{-1} = I$$
$$(AB)E_1E_2\cdots E_n = I$$

Elementary matrices are invertible.

$$AB = E_n^{-1}E_{n-1}^{-1}\cdots E_1^{-1}$$

We hypothesized that B was invertible.

$$(AB)B^{-1} = \left(E_n^{-1}E_{n-1}^{-1}\cdots E_1^{-1}\right)B^{-1}$$

##B^{-1}## can be written as a product of elementary matrices ##F_1F_2\cdots F_k##, and by the associative law, ##(AB)B^{-1} = A(BB^{-1}) = AI = A##.

$$A = E_n^{-1}E_{n-1}^{-1}\cdots E_1^{-1}\left(F_1F_2\cdots F_k\right)$$

∴ A is invertible, as it can be written as a product of elementary matrices.

This is a contradiction to the hypothesis that A is singular. ##\rightarrow\leftarrow##

This proves case 3 isn't possible.

(Sorry I took so long in replying to you.)
 
  • #13
Bipolarity said:
Unfortunately, I was not able to apply the above step to the case where only A is singular. If A is singular, Ax = 0 has nontrivial solutions. But how can I show that ABx = 0 has nontrivial solutions?

If B is nonsingular, you can write ##y = B^{-1}x##, and ##y = 0## if and only if ##x = 0##.

Now think about ##Ax## and ##ABy##.
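
Carrying that hint through (with ##x \neq 0## chosen so that ##Ax = 0##): set ##y = B^{-1}x##, so ##y \neq 0## and ##By = x##. Then
$$(AB)y = A(By) = Ax = 0,$$
so ##ABy = 0## has a nontrivial solution and ##AB## is singular.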
 
  • #14
An indirect way to prove this is to first show that a square matrix is invertible if and only if its determinant is not 0. Then if AB is invertible, det(AB) is not 0. But det(A)det(B) = det(AB), so neither det(A) nor det(B) is 0.
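
In one line, using the multiplicativity of the determinant:
$$\det(A)\det(B) = \det(AB) \neq 0 \;\Longrightarrow\; \det(A) \neq 0 \ \text{and} \ \det(B) \neq 0.$$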
 
  • #15
Why complicate things?

If AB is invertible, then ##AB(AB)^{-1} = I##, which means that A has the right inverse ##B(AB)^{-1}##.
Similarly, ##(AB)^{-1}AB = I## means that B has the left inverse ##(AB)^{-1}A##.
Since a one-sided inverse of a square matrix is automatically two-sided (as noted earlier in the thread), A and B are invertible.
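
The verification is just associativity:
$$A\left(B(AB)^{-1}\right) = (AB)(AB)^{-1} = I, \qquad \left((AB)^{-1}A\right)B = (AB)^{-1}(AB) = I.$$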
 