Inverse of a matrix

  1. Dec 28, 2007 #1
    Question: Let A and B be nxn matrices such that AB is invertible. Prove that A and B are invertible.

    All I have so far is that there exists a matrix C such that
    (AB)C = I and C(AB) = I.

    How do I use this to show that there exists D such that AD = DA = I and that there exists E such that BE = EB = I ???
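    One way to use C directly (a sketch, not necessarily the intended route) is to regroup with associativity:

    \[
    (AB)C = I \;\Longrightarrow\; A(BC) = I, \qquad C(AB) = I \;\Longrightarrow\; (CA)B = I,
    \]

    so BC is a right inverse of A and CA is a left inverse of B. For square matrices a one-sided inverse is automatically two-sided (that fact itself needs an argument, e.g. via rank or determinants), after which D = BC and E = CA do the job.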
  2. Dec 28, 2007 #2
    If no such D and E exist, then there are no such D and E such that

    I = EB = E(DA)B = (ED)(AB),

    contradicting the existence of C. Is that correct?
  3. Dec 28, 2007 #3
    Are you specifically looking for a "direct" proof? If not, the simplest argument is: a matrix is invertible if and only if its determinant is non-zero. If AB is invertible, its determinant is non-zero. Also, det(AB) = det(A)det(B). For a product of two numbers to be non-zero, neither factor can be zero: det(A) is non-zero, so A is invertible; det(B) is non-zero, so B is invertible.
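    Spelled out as a short chain (a sketch, assuming the multiplicativity of the determinant and the adjugate formula):

    \[
    \det(AB) = \det(A)\,\det(B) \neq 0 \;\Longrightarrow\; \det(A) \neq 0 \ \text{and}\ \det(B) \neq 0,
    \]

    so, for instance, \(A^{-1} = \tfrac{1}{\det(A)}\,\operatorname{adj}(A)\) exists, and likewise for \(B\).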

    I would agree that a direct proof, not using the determinant, would be preferable. For that, you will have to be careful. It is NOT true in general that if, for two functions f and g, f(g(x)) has an inverse, then f and g separately have inverses. To prove this for matrices (i.e. for linear transformations) you will need to use the "linearity". In particular, if B is NOT invertible, then there must exist a non-zero vector v such that Bv = 0. But then (AB)v = A(Bv) = A0 = 0.
  4. Dec 28, 2007 #4
    I'm not sure it makes a lot of sense to say "no such D and E exist" and then write an equation with D and E! As I said before, the statement "if f(g(x)) has an inverse, then f(x) and g(x) must have inverses" is NOT true for general functions f and g. Let g: {a} -> {x, y} be defined by g(a) = x, and let f: {x, y} -> {b} be defined by f(x) = f(y) = b. Then f(g) maps a to b and is a bijection, so it has an inverse, but f is not "one to one" and g is not onto, so neither f nor g has an inverse.
  5. Dec 28, 2007 #5
    Suppose B is singular. Then there exists a nonzero vector v such that Bv = 0, hence
    (AB)v = A(Bv) = A(0) = 0. But AB is nonsingular, so v must equal zero, a contradiction; hence B is invertible.

    A similar argument works for A as well (see the sketch below).
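    To round the direct argument off (a sketch; the transpose/row-vector step is one way to make "similar" precise):

    \[
    Bv = 0,\ v \neq 0 \;\Longrightarrow\; (AB)v = 0 \;\Longrightarrow\; v = (AB)^{-1}(AB)v = 0,
    \]

    a contradiction, so B is invertible. If A were singular, then so is A^T, so there is a nonzero row vector w with wA = 0, and then

    \[
    w(AB) = (wA)B = 0 \;\Longrightarrow\; w = w(AB)(AB)^{-1} = 0,
    \]

    again a contradiction. Equivalently, once B is known to be invertible, A = (AB)B^{-1} is a product of invertible matrices, with A^{-1} = B(AB)^{-1}.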