If AB is invertible, then A and B are invertible.

by Bipolarity
Tags: invertible
Bipolarity
#1
Dec6-12, 10:56 AM
P: 783
If AB is invertible, then A and B are invertible for square matrices A and B.

I am curious about the proof of the above. I know there is a very straightforward proof that involves determinants, but I am interested in seeing if there is a proof that doesn't use determinants.

In an attempt to prove this, I considered the contrapositive:
If at least one of {A,B} is singular, then AB is singular.

I successfully proved that if B is singular (or if both A and B are singular), then AB is necessarily singular. To do this, I showed that Bx = 0 having nontrivial solutions implies that ABx= 0 has nontrivial solutions.

Unfortunately, I was not able to apply the above step to the case where only A is singular. If A is singular, Ax= 0 has nontrivial solutions. But how can I show that ABx = 0 has nontrivial solutions?

BiP
micromass
#2
Dec6-12, 11:13 AM
Mentor
micromass's Avatar
P: 18,038
It is a famous result for matrices (and only for matrices) that if a matrix C has a left inverse, then it has an inverse. Thus if there exists a matrix D such that DC=I, then CD=I as well. The same holds of course for C having a right inverse.

Do you know this result? It follows essentially from the rank-nullity theorem. You can use that to prove your result.

Another way to prove it: there is a result that if A is singular, then there exists a vector c such that Ax = c has no solution. Can you prove this result? (Again, it is basically rank-nullity.) Can you use it to prove your theorem?
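To spell out the rank-nullity step for an ##n\times n## matrix ##C## (one possible route, not the only one): rank-nullity says
##\text{rank}(C) + \text{nullity}(C) = n.##
If ##DC = I## and ##Cx = 0##, then ##x = (DC)x = D(Cx) = 0##, so ##\text{nullity}(C) = 0## and hence ##\text{rank}(C) = n##. A square matrix of full rank is onto as well as one-to-one, so it is invertible, and its left inverse ##D## is then its two-sided inverse.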
D H
#3
Dec6-12, 12:31 PM
Mentor
P: 15,065
Quote by Bipolarity:
I successfully proved that if B is singular (or if both A and B are singular), then AB is necessarily singular. To do this, I showed that Bx = 0 having nontrivial solutions implies that ABx= 0 has nontrivial solutions.

Unfortunately, I was not able to apply the above step to the case where only A is singular. If A is singular, Ax= 0 has nontrivial solutions. But how can I show that ABx = 0 has nontrivial solutions?
Use a row vector rather than a column vector. If A is singular then xA=0 has nontrivial solutions.
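Spelling this hint out: if ##x## is a nonzero row vector with ##xA = 0##, then
##x(AB) = (xA)B = 0,##
so the rows of ##AB## satisfy a nontrivial linear dependence and ##AB## is singular. (Equivalently, ##(AB)^T y = 0## has the nontrivial solution ##y = x^T##.)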

Erland
#4
Dec6-12, 06:50 PM
P: 338

If ##AB## has an inverse ##(AB)^{-1}##, this means that ##AB(AB)^{-1}=I##. Can you use this to find a matrix ##C## such that ##AC=I##?
If you find such a ##C##, then also ##CA=I##, as micromass pointed out.
lavinia
#5
Dec6-12, 08:56 PM
Sci Advisor
P: 1,716
If you think of a square matrix as a linear mapping, then it is invertible only if it is one-to-one and onto.

This means that it can send only the zero vector to zero.
If A or B were not invertible, then there would be a nonzero vector ##v## such that either ##Bv = 0##, in which case ##ABv = 0## and AB is not invertible, or, if B is invertible but A is not, with ##Av = 0##, then ##AB(B^{-1}v) = Av = 0## and AB is not invertible.
mathwonk
#6
Dec6-12, 10:33 PM
Sci Advisor
HW Helper
mathwonk's Avatar
P: 9,453
do you know that matrices represent linear maps between linear spaces? if so do you know the basic theory of dimension of linear spaces?

there is a fundamental formula that dim(image) + dim(kernel) = dim(source).

it follows from this that if either map has a non trivial kernel, the image of the composition cannot be the whole target space.
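in symbols, for ##n\times n## matrices viewed as maps on an ##n##-dimensional space (a sketch of the above):
##\text{im}(AB) \subseteq \text{im}(A) \quad\text{and}\quad \dim\text{im}(AB) \le \dim\text{im}(B),##
since ##\text{im}(AB) = A(\text{im}\,B)## and a linear map cannot increase dimension. so if either kernel is nontrivial, rank-nullity gives ##\dim\text{im}(AB) \le \min(\text{rank}\,A, \text{rank}\,B) < n##, and the composition is not onto.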
homeomorphic
#7
Dec8-12, 12:04 AM
P: 1,196
This illustrates what's so great about linear algebra. You could prove that matrix multiplication is associative by brute force algebra (yuck!) or you could just observe that it's obvious because composing functions is associative and matrices represent functions. Actually, matrix multiplication is defined to be the way it is, precisely so that the matrix of the composition of the functions is the product of the matrices of the functions being composed. This is another example of that sort of phenomenon, although the algebraic proof isn't too hard, as people have hinted at.
micromass
#8
Dec9-12, 01:55 AM
Mentor
micromass's Avatar
P: 18,038
I would find it nice if the OP would return to this thread and give us some feedback. Did he manage to prove it? Were we clear enough with our suggestions? Is he totally confused by everything we said?
marxLynx
#9
Dec22-12, 06:37 PM
P: 7
Consider this. If AB is defined and ##(AB)^{-1}## exists, then there are only four possibilities: 1) A and B are both invertible; 2) A is invertible and B is singular; 3) A is singular and B is invertible; 4) A and B are both singular. Go through each case, and you'll see that A and B have to be invertible if ##(AB)^{-1}## exists (you should reach a contradiction with the hypothesis that AB is invertible in cases 2-4). Once you rule out the other cases, case 1 is the only possibility left. Use elementary matrices in your argument if you don't want to use determinants.
marxLynx
#10
Dec23-12, 09:43 PM
P: 7
If A and B are square matrices and ##(AB)^{-1}## exists, then A is invertible and B is invertible.
Proof: If AB is defined and ##(AB)^{-1}## exists, then there are four possibilities: A and B are both invertible, A is invertible and B is singular, A is singular and B is invertible, or A and B are both singular.

Case 1: Trivial

Case 2: Let A be an invertible matrix and B be a singular matrix, let AB be defined, and let ##(AB)^{-1}## exist.
##(AB)(AB)^{-1} = I##
Since ##(AB)^{-1}## exists, it can be written as a product of elementary matrices.
##(AB)E_1E_2\cdots E_n = I##
##AB = E_n^{-1}E_{n-1}^{-1}\cdots E_1^{-1}##
##A^{-1}AB = A^{-1}E_n^{-1}E_{n-1}^{-1}\cdots E_1^{-1}##
##B = A^{-1}E_n^{-1}E_{n-1}^{-1}\cdots E_1^{-1}##
Since ##A^{-1}## exists, it too can be written as a product of elementary matrices.
∴ B is invertible, as it can be written as a product of elementary matrices.
This is a contradiction to the hypothesis that B is singular.
##\rightarrow\leftarrow##

(This also proves that case 3 can't be true.)

Case 4: Let A and B both be singular matrices, and let AB be defined.

There exists some sequence of row operations that turns A into its reduced row echelon form, so there exists a product of elementary matrices that achieves the same purpose.
##E_1E_2\cdots E_r A = R##
##A = E_r^{-1}\cdots E_1^{-1}R##
##AB = E_r^{-1}\cdots E_1^{-1}RB##
Now, the product ##RB## contains a row of all zeroes, as ##R##, the reduced row echelon form of a singular matrix, has a row of zeroes. So ##AB##, which is an invertible matrix times ##RB##, has rank less than ##n## ##\Rightarrow## there is no matrix C such that
##C(AB) = (AB)C = I##
∴ If A and B are singular, then the product AB can't be invertible.

The only case left is case 1, which is trivially true.

∴ If AB is defined and ##(AB)^{-1}## exists, then A is invertible and B is invertible. ##\triangleleft##
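As a quick numerical sanity check of case 4 (not part of the proof; the matrices below are just illustrative examples), two singular ##2\times 2## matrices multiply to a singular product:

```python
def det2(m):
    # determinant of a 2x2 matrix [[a, b], [c, d]]: a*d - b*c
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def matmul2(a, b):
    # product of two 2x2 matrices given as lists of rows
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 2], [2, 4]]   # singular: second row is twice the first
B = [[3, 6], [1, 2]]   # singular: second column is twice the first
AB = matmul2(A, B)

print(det2(A), det2(B), det2(AB))   # 0 0 0
# ABx = 0 has the nontrivial solution x = (2, -1), as in the OP's approach:
x = (2, -1)
print([AB[i][0] * x[0] + AB[i][1] * x[1] for i in range(2)])   # [0, 0]
```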
Bipolarity
#11
Jan8-13, 11:22 PM
P: 783
Thank you for writing it all out! It must have been a pain! I'm sorry to ask this but could you outline the proof for Case 3 also? It's the one I have the most trouble with. Thanks!

BiP
marxLynx
#12
Sep25-13, 03:52 PM
P: 7
Proving case 3 is exactly the same as proving case 2, although we will be multiplying on the right instead of the left.

Case 3: Let A be a singular matrix and let B be an invertible matrix, let AB be defined, and let ##(AB)^{-1}## exist. We know that ##(AB)(AB)^{-1}=I##. As with case 2, we can write ##(AB)^{-1}## as a product of elementary matrices.

##(AB)(AB)^{-1} = I##
##(AB)E_1E_2\cdots E_n = I##

Elementary matrices are invertible.

##AB = E_n^{-1}E_{n-1}^{-1}\cdots E_1^{-1}##

We hypothesized that B was invertible.

##(AB)B^{-1} = (E_n^{-1}E_{n-1}^{-1}\cdots E_1^{-1})B^{-1}##

##B^{-1}## can be written as a product of elementary matrices ##F_1F_2\cdots F_k##, and by the associative law, ##(AB)B^{-1}=A(BB^{-1})=AI=A##.

##A = E_n^{-1}E_{n-1}^{-1}\cdots E_1^{-1}(F_1F_2\cdots F_k)##

∴ A is invertible, as it can be written as a product of elementary matrices.

This is a contradiction to the hypothesis that A is singular. ##\rightarrow\leftarrow##

This proves case 3 isn't possible.

(Sorry I took so long to reply.)
AlephZero
#13
Sep25-13, 04:39 PM
Engineering
Sci Advisor
HW Helper
Thanks
P: 6,953
Quote by Bipolarity:
Unfortunately, I was not able to apply the above step to the case where only A is singular. If A is singular, Ax= 0 has nontrivial solutions. But how can I show that ABx = 0 has nontrivial solutions?
If B is nonsingular, you can write ##y = B^{-1}x##, and ##y = 0## if and only if ##x = 0##.

Now think about Ax and ABy.
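Completing the hint: if ##Ax = 0## with ##x \neq 0##, put ##y = B^{-1}x##, so ##y \neq 0## and ##x = By##. Then
##(AB)y = A(By) = Ax = 0,##
so ##(AB)y = 0## has a nontrivial solution and ##AB## is singular.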
HallsofIvy
#14
Sep27-13, 08:26 AM
Math
Emeritus
Sci Advisor
Thanks
PF Gold
P: 39,345
An indirect way to prove this is to first show that a square matrix is invertible if and only if its determinant is not 0. Then if AB is invertible, det(AB) is not 0. But det(A)det(B) = det(AB), so neither det(A) nor det(B) is 0.
Erland
#15
Sep27-13, 02:57 PM
P: 338
Why complicate things?

If AB is invertible, then ##AB(AB)^{-1}=I##, which means that A has the inverse ##B(AB)^{-1}##.
Similarly, ##(AB)^{-1}AB=I## means that B has the inverse ##(AB)^{-1}A##.
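These two formulas are easy to check numerically. Here is a small sanity check in Python using exact arithmetic with fractions; the ##2\times 2## matrices are arbitrary invertible examples, and `matmul`/`inverse` are ad-hoc helpers (Gauss-Jordan elimination), not a library API:

```python
from fractions import Fraction

def matmul(X, Y):
    # product of two square matrices given as lists of rows
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def inverse(M):
    # inverse via Gauss-Jordan elimination on the augmented matrix [M | I]
    n = len(M)
    aug = [[Fraction(M[i][j]) for j in range(n)] +
           [Fraction(int(i == j)) for j in range(n)] for i in range(n)]
    for col in range(n):
        # find a row with a nonzero pivot and swap it into place
        pivot = next(r for r in range(col, n) if aug[r][col] != 0)
        aug[col], aug[pivot] = aug[pivot], aug[col]
        p = aug[col][col]
        aug[col] = [x / p for x in aug[col]]
        for r in range(n):
            if r != col and aug[r][col] != 0:
                f = aug[r][col]
                aug[r] = [a - f * b for a, b in zip(aug[r], aug[col])]
    return [row[n:] for row in aug]

I = [[1, 0], [0, 1]]
A = [[2, 1], [1, 1]]   # invertible (det = 1)
B = [[1, 3], [0, 1]]   # invertible (det = 1)
AB_inv = inverse(matmul(A, B))

# Erland's formulas: B(AB)^{-1} inverts A, and (AB)^{-1}A inverts B
print(matmul(A, matmul(B, AB_inv)) == I)   # True
print(matmul(matmul(AB_inv, A), B) == I)   # True
```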

