
#1
Dec 6, 2012, 10:56 AM

P: 783

If AB is invertible, then A and B are invertible, for square matrices A and B.

I am curious about the proof of the above. I know there is a very straightforward proof that involves determinants, but I am interested in seeing if there is a proof that doesn't use determinants.

In an attempt to prove this, I considered the contrapositive: if at least one of {A, B} is singular, then AB is singular. I successfully proved that if B is singular (or if both A and B are singular), then AB is necessarily singular. To do this, I showed that Bx = 0 having nontrivial solutions implies that ABx = 0 has nontrivial solutions.

Unfortunately, I was not able to apply the above step to the case where only A is singular. If A is singular, Ax = 0 has nontrivial solutions. But how can I show that ABx = 0 has nontrivial solutions?

BiP



#2
Dec 6, 2012, 11:13 AM

Mentor
P: 16,651

It is a famous result for matrices (and only for matrices) that if a matrix C has a left inverse, then it has an inverse. Thus if there exists a matrix D such that DC = I, then CD = I as well. The same of course holds for C having a right inverse.

Do you know this result? It follows essentially from the rank-nullity theorem, and you can use it to prove your result.

Another way to prove it: there is a result that if A is singular, then there exists a vector c such that Ax = c has no solution. Can you prove this result? (Again, it is basically rank-nullity.) Can you use this to prove your theorem?
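For reference, a sketch of the first result via rank-nullity, assuming C and D are ##n \times n##:

```latex
% DC = I  =>  (Cx = 0 \Rightarrow x = D(Cx) = 0)  =>  \ker C = \{0\}.
% Rank-nullity:  \operatorname{rank} C = n - \dim\ker C = n,
% so C is onto, hence bijective, and C^{-1} exists.  Then
%   D = D(CC^{-1}) = (DC)C^{-1} = C^{-1},
% so CD = CC^{-1} = I.
```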



#3
Dec 6, 2012, 12:31 PM

Mentor
P: 14,459





#4
Dec 6, 2012, 06:50 PM

P: 303

If AB is invertible, then A and B are invertible.
If ##AB## has an inverse ##(AB)^{-1}##, this means that ##AB(AB)^{-1}=I##. Can you use this to find a matrix ##C## such that ##AC=I##?
If you find such a ##C##, then also ##CA=I##, as micromass pointed out.
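For completeness, one way to fill in the hint, spelling out the associativity step:

```latex
C := B(AB)^{-1}
\quad\Longrightarrow\quad
AC = A\bigl(B(AB)^{-1}\bigr) = (AB)(AB)^{-1} = I .
```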



#5
Dec 6, 2012, 08:56 PM

Sci Advisor
P: 1,716

If you think of a square matrix as a linear mapping, then it is invertible only if it is 1 to 1 and onto. This means that it can send zero, and no other vector, to zero.

If A or B were not invertible, then there would be a vector v such that either Bv = 0, in which case ABv = 0 and AB is not invertible; or, if B is invertible but A is not, with Av = 0, then AB(B^{-1}v) = Av = 0 and AB is not invertible.
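The second case above, on a hypothetical 2×2 example (matrices chosen purely for illustration):

```python
# If B is invertible but Av = 0 for some v != 0, then the nonzero vector
# w = B^{-1} v satisfies (AB)w = A(B B^{-1} v) = Av = 0, so AB is singular.

def matvec(M, x):
    """Apply a 2x2 matrix to a length-2 vector."""
    return [sum(M[i][k] * x[k] for k in range(2)) for i in range(2)]

A = [[1, 2], [2, 4]]        # singular: Av = 0 for v = (2, -1)
B = [[2, 1], [1, 1]]        # invertible, det = 1
B_inv = [[1, -1], [-1, 2]]  # inverse of B (easy to write down since det = 1)

v = [2, -1]
w = matvec(B_inv, v)        # w = B^{-1} v, nonzero

assert matvec(A, v) == [0, 0]        # v is in the kernel of A
assert matvec(B, w) == v             # Bw = v, so w != 0
assert matvec(A, matvec(B, w)) == [0, 0]   # (AB)w = Av = 0
```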



#6
Dec 6, 2012, 10:33 PM

Sci Advisor
HW Helper
P: 9,422

do you know that matrices represent linear maps between linear spaces? if so do you know the basic theory of dimension of linear spaces?
there is a fundamental formula that dim(image) + dim(kernel) = dim(source). it follows from this that if either map has a non trivial kernel, the image of the composition cannot be the whole target space. 
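Spelled out, that dimension argument might run as follows, for ##n \times n## matrices viewed as maps on an ##n##-dimensional space:

```latex
% rank-nullity:  \dim\operatorname{im}(T) + \dim\ker(T) = n.
% If \ker B \neq \{0\}:  \operatorname{im}(AB) = A(\operatorname{im} B),
%   so \dim\operatorname{im}(AB) \le \dim\operatorname{im}(B) < n.
% If \ker A \neq \{0\}:  \dim\operatorname{im}(AB) \le \dim\operatorname{im}(A) < n.
% Either way the composition AB is not onto, hence not invertible.
```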



#7
Dec 8, 2012, 12:04 AM

P: 1,039

This illustrates what's so great about linear algebra. You could prove that matrix multiplication is associative by brute force algebra (yuck!) or you could just observe that it's obvious because composing functions is associative and matrices represent functions. Actually, matrix multiplication is defined to be the way it is, precisely so that the matrix of the composition of the functions is the product of the matrices of the functions being composed. This is another example of that sort of phenomenon, although the algebraic proof isn't too hard, as people have hinted at.




#8
Dec 9, 2012, 01:55 AM

Mentor
P: 16,651

I would find it nice if the OP would return to this thread and give us some feedback. Did he manage to prove it? Were we clear enough with our suggestions? Is he totally confused by everything we said?




#9
Dec 22, 2012, 06:37 PM

P: 7

Consider this. If AB is defined and ##(AB)^{-1}## exists, then there are only four possibilities: 1. A and B are both invertible; 2. A is invertible and B is singular; 3. A is singular and B is invertible; 4. A and B are both singular. Go through each case, and you'll see that A and B have to be invertible if ##(AB)^{-1}## exists (you should reach a contradiction to the hypothesis that AB is invertible in cases 2-4). Once you rule out the other cases, case 1 is the only possibility left. Use elementary matrices in your argument if you don't want to use determinants.




#10
Dec 23, 2012, 09:43 PM

P: 7

If A and B are square matrices and ##(AB)^{-1}## exists, then A is invertible and B is invertible.

Proof: If AB is defined and ##(AB)^{-1}## exists, then there are four possibilities: A and B are both invertible, A is invertible and B is singular, A is singular and B is invertible, or A and B are both singular.

Case 1: Trivial.

Case 2: Let A be an invertible matrix and B be a singular matrix, let AB be defined, and let ##(AB)^{-1}## exist. Then ##(AB)(AB)^{-1} = I##. Since ##(AB)^{-1}## exists, it can be written as a product of elementary matrices: ##(AB)E_1 E_2 \cdots E_n = I##, so ##AB = E_n^{-1} E_{n-1}^{-1} \cdots E_1^{-1}##. Since ##A^{-1}## exists, multiply on the left: ##A^{-1}AB = A^{-1} E_n^{-1} E_{n-1}^{-1} \cdots E_1^{-1}##, so ##B = A^{-1} E_n^{-1} E_{n-1}^{-1} \cdots E_1^{-1}##. ∴ B is invertible, as it can be written as a product of invertible matrices. This is a contradiction to the hypothesis that B is singular. ##\rightarrow\leftarrow## (This also proves that case 3 can't be true.)

Case 4: Let A and B both be singular matrices, and let AB be defined. There exists some sequence of row operations that will turn A into its reduced row echelon form R, so there exists some product of elementary matrices that achieves the same purpose: ##E_1 E_2 \cdots E_r A = R##, i.e. ##A = E_r^{-1} \cdots E_1^{-1} R## and ##AB = E_r^{-1} \cdots E_1^{-1} RB##. Now, the product RB will contain a row of all zeroes, as R is the reduced row echelon form of a singular matrix, so RB is singular, and multiplying it by the invertible matrix ##E_r^{-1} \cdots E_1^{-1}## cannot fix this. ##\Rightarrow## there is no matrix C such that ##C(AB) = (AB)C = I##. ∴ If A and B are singular, then the product AB can't be invertible.

The only case left is case 1, which is trivially true. ∴ If AB is defined and ##(AB)^{-1}## exists, then A is invertible and B is invertible. ##\triangleleft##




#11
Jan 8, 2013, 11:22 PM

P: 783

Thank you for writing it all out! It must have been a pain! I'm sorry to ask this but could you outline the proof for Case 3 also? It's the one I have the most trouble with. Thanks!
BiP 



#12
Sep 25, 2013, 03:52 PM

P: 7

Proving case 3 is exactly the same as proving case 2, although we will be multiplying on the right instead of the left.

Case 3: Let A be a singular matrix and B be an invertible matrix, let AB be defined, and let ##(AB)^{-1}## exist. We know that ##(AB)(AB)^{-1} = I##. As with case 2, we can write ##(AB)^{-1}## as a product of elementary matrices: ##(AB)E_1 E_2 \cdots E_n = I##. Elementary matrices are invertible, so ##AB = E_n^{-1} E_{n-1}^{-1} \cdots E_1^{-1}##. We hypothesized that B is invertible, so multiply on the right: ##(AB)B^{-1} = (E_n^{-1} E_{n-1}^{-1} \cdots E_1^{-1})B^{-1}##. Now ##B^{-1}## can be written as a product of elementary matrices ##F_1 F_2 \cdots F_k##, and by the associative law ##(AB)B^{-1} = A(BB^{-1}) = AI = A##, so ##A = E_n^{-1} E_{n-1}^{-1} \cdots E_1^{-1}(F_1 F_2 \cdots F_k)##. ∴ A is invertible, as it can be written as a product of invertible matrices. This is a contradiction to the hypothesis that A is singular. ##\rightarrow\leftarrow## This proves case 3 isn't possible.

(Sorry I took so long in replying to you.)



#13
Sep 25, 2013, 04:39 PM

Engineering
Sci Advisor
HW Helper
Thanks
P: 6,356

Now think about Ax and ABy. 



#14
Sep 27, 2013, 08:26 AM

Math
Emeritus
Sci Advisor
Thanks
PF Gold
P: 38,886

An indirect way to prove this is to first show that a square matrix is invertible if and only if its determinant is not 0. Then if AB is invertible, det(AB) is not 0. But det(A)det(B) = det(AB), so neither det(A) nor det(B) is 0.
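A quick 2×2 sanity check of the determinant identity used here (the example matrices are arbitrary):

```python
# det(AB) = det(A) det(B), checked on one 2x2 example.

def det2(M):
    """Determinant of a 2x2 matrix given as nested lists."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def matmul(M, N):
    """Multiply two 2x2 matrices."""
    return [[sum(M[i][k] * N[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 2], [3, 5]]   # det(A) = -1
B = [[2, 0], [1, 3]]   # det(B) = 6

assert det2(matmul(A, B)) == det2(A) * det2(B)   # both sides equal -6
```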




#15
Sep 27, 2013, 02:57 PM

P: 303

Why complicate things?
If AB is invertible, then ##AB(AB)^{-1}=I##, which means that A has the inverse ##B(AB)^{-1}##. Similarly, ##(AB)^{-1}AB=I## means that B has the inverse ##(AB)^{-1}A##.
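Strictly, these computations exhibit one-sided inverses; combined with the left/right-inverse result from post #2 they become honest two-sided inverses:

```latex
A \cdot \bigl(B(AB)^{-1}\bigr) = (AB)(AB)^{-1} = I
\quad \text{(a right inverse of } A\text{)},
\qquad
\bigl((AB)^{-1}A\bigr) \cdot B = (AB)^{-1}(AB) = I
\quad \text{(a left inverse of } B\text{)}.
```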

