
Homework Help: Showing that a matrix is invertible

  1. Oct 16, 2008 #1
    I'm a bit confused about how to solve these problems on matrix inversion. I know the basic properties of inverses, but I need some explanation of how to prove or disprove questions such as the ones below; I need to justify each of these as true or false.

    1. If A is an invertible, real matrix... is the matrix A+2(A^-1) invertible?

    2. And also, if a matrix B satisfies: (B^3)-2B+I=0, then the matrix (B-I) cannot be invertible?

    I would appreciate any help on how to solve these problems, or even if someone could provide me with some properties that I can use to solve them.

    I know I saw an example asking if A + (A^-1) is invertible, given that A is invertible; in this example A + (A^-1) was said to be equal to (A^2) + I, and then the problem was solved from there; why are these two equal? The extent of what I know about invertible matrices is how to solve for the inverse of A with the formula, that A(A^(-1)) = I = (A^(-1))A, and how to find inverses using elementary row operations...

    Thanks for the help!
  3. Oct 16, 2008 #2


    Staff: Mentor

    For the second one (I'm still working on the first), the cubic on the left side can be factored.
    B^3 - 2B + I = 0
    => (B - I)(B^2 + B - I) = 0
    One solution is B = I, but there are other solutions as well. Keep in mind that although there was a restriction on the A in the first problem (A is invertible), the same restriction did not appear for B in the second problem.

    A fine point: You asked "why are these two equations equal?" One equation is never equal to another. One equation might imply another, or the two equations might be equivalent (in which case each equation implies the other).
  4. Oct 16, 2008 #3


    Science Advisor
    Homework Helper

    These aren't equations -- they're matrices -- and they aren't (necessarily) equal.

    In fact, A + A^(-1) = A^(-1)(A^2 + I). This is what I'm assuming your source has done. Maybe this will help you do question 1 now. If you need further help, post back.
  5. Oct 16, 2008 #4


    Staff: Mentor

    Doing the second problem gave me an idea for the first. An important idea in determining whether a matrix has an inverse is knowing how many solutions there are to the equation Ax = 0. Here A is the square matrix we're considering and x is a column vector with as many entries as there are rows or columns in A.

    Obviously one solution to the equation Ax = 0 is x = [0 0 ... 0]^T. Another possibility is that A is the 0 matrix, a square matrix all of whose entries are 0. For some square matrices it's possible that Ax = 0, even when A is not the zero matrix and x is not the zero vector. Where I'm going with this is, if the equation Ax = 0 has exactly one solution (the obvious one), then A has an inverse.

    So let (A + 2A^-1)x = 0
    Clearly, if x = 0, the equation above is true, but that's not interesting.

    Multiply each side on the left by A:
    A * (A + 2A^-1)x = A*0
    Matrix multiplication is distributive, so
    (A^2 + 2A*A^-1)x = 0

    (A^2 + 2I)x = 0

    One solution is x = 0. Is it possible for A^2 + 2I to be equal to the 0 matrix? If so, then A + 2A^-1 can't have an inverse.

    The characteristic polynomial y^2 + 2 = 0 might be helpful here.
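    As a numerical illustration of where this hint leads (the specific matrix below is an assumed example, not one from the thread): there is a real 2 x 2 matrix with A^2 = -2I, and for it A + 2A^(-1) works out to the zero matrix.

    ```python
    import numpy as np

    # A real 2x2 matrix with A^2 = -2I: sqrt(2) times a 90-degree rotation.
    # (This particular matrix is an illustration, not taken from the thread.)
    A = np.sqrt(2) * np.array([[0.0, -1.0],
                               [1.0,  0.0]])

    print(np.allclose(A @ A, -2 * np.eye(2)))        # True: A^2 + 2I = 0
    print(np.allclose(A + 2 * np.linalg.inv(A), 0))  # True: A + 2A^{-1} is the zero matrix
    ```

    Since the zero matrix is not invertible, such an A shows the sum A + 2A^(-1) need not be invertible.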
  6. Oct 17, 2008 #5
    Hi Mark
    Thanks for your explanations. I've looked over your description for part 2; I understand what you are saying about setting (A + 2A^-1)x = 0; this is because if there is a solution for (A + 2A^-1) = 0, then the matrix cannot be invertible. You are saying that you need there to be exactly that one trivial solution of x=0 (because no matrix can bring 0 back to x...), correct?

    So, you multiplied each side by A, so that the equation will have an A(A^(-1)) which can be set to I. So now, you need to prove that A^2 + 2I cannot equal zero if it's invertible?

    What do you mean by the characteristic polynomial y^2 + 2 = 0 ? Where did you get this polynomial from?

    Could I solve this by setting A = to an arbitrary matrix, squaring it and then adding 2*the identity matrix?

    I apologize for all the questions; I haven't taken any course in Math for 4 years! Thanks again!
  7. Oct 17, 2008 #6
    morphism, I was just confused how you get A+(A^-1) = (A^-1)(A^2 + I); how does this solve out... I'm sure it's just simple math, but just wondering how you obtain this; I really don't have much knowledge in inverses...? Is this from setting to 0, and then multiplying by A(A^-1)?

  8. Oct 17, 2008 #7


    Science Advisor
    Homework Helper

    Think back to basic algebra. Say you have something like ab+ac. We can "pull the a out" of this to get a(b+c).

    What we have here is exactly that: A + A^(-1) = (A^(-1)A)A + A^(-1) = A^(-1)(AA + I) = A^(-1)(A^2 + I).
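    This factoring is easy to confirm numerically; here is a minimal NumPy sketch (the random test matrix is arbitrary):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.standard_normal((3, 3))   # a random matrix is invertible with probability 1
    Ainv = np.linalg.inv(A)
    I = np.eye(3)

    lhs = A + Ainv
    rhs = Ainv @ (A @ A + I)          # "pulling A^{-1} out on the left"

    print(np.allclose(lhs, rhs))      # True
    ```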
  9. Oct 17, 2008 #8


    Staff: Mentor

    This was actually the first problem. I think you're getting what I wrote, but let me summarize it.

    For an equation (A^2 + 2I)x = 0, there are three possibilities:
    1. x = 0 (not very interesting)
    2. A^2 + 2I is the zero matrix
    3. A^2 + 2I is not the zero matrix and x is not 0
    With regard to item 3 above, some matrices carry nonzero vectors to zero. The transformations represented by these matrices are not one-to-one, hence are not invertible, which means that the matrix is not invertible.

    The work I showed you had to do with the 2nd point. If A^2 + 2I is the zero matrix, what does that say about A? The characteristic polynomial, you'll notice, was 2nd degree, just like the matrix expression above. The coefficients of the matrix expression are the same as those of the characteristic polynomial; that's not an accident either. Solving the characteristic equation gives you insight into the solutions of A^2 + 2I = 0.
  10. Oct 18, 2008 #9
    Hi Mark,
    Again I apologize that I am having a tough time conceptualizing this... I have not been over the concept of characteristic polynomials before...

    I understand your explanation that I need to find if (A^2 + 2I) = 0; what I am confused about is how to go about doing this? Would I plug in an arbitrary, invertible matrix for A? And then add 2I and see if it equals zero? And then, if I find an example where this equals 0, it shows that it is not invertible?

    For the second problem, I guess I'm just lost on how to find out if it is invertible? Would I apply the same notion that if I can find (B-I) to equal zero, then the matrix would not be invertible?

  11. Oct 19, 2008 #10


    Staff: Mentor

    The characteristic equation for A^2 + 2I = 0 is x^2 + 2 = 0. Both equations are 2nd degree, and the coefficients of the various powers of A are the same as those of the corresponding powers of x. For the constant term, I plays the role of 1, since multiplying a square matrix A by I yields A, just as multiplying a number n by 1 yields n.

    What solutions do you get for x^2 + 2 = 0? The solutions (plural) for x will correspond to solutions A to the equation A^2 + 2I = 0.

    To help you get your mind around this concept, you can work with small (i.e., 2 x 2) matrices, but keep in mind that the problem is really more general.
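    As a small worked illustration of this correspondence (the scalar matrix below is an assumed example with complex entries): a root x = i*sqrt(2) of x^2 + 2 = 0 gives a matrix A = i*sqrt(2)*I satisfying A^2 + 2I = 0.

    ```python
    import numpy as np

    I = np.eye(2, dtype=complex)
    A = 1j * np.sqrt(2) * I              # corresponds to the root x = i*sqrt(2) of x^2 + 2 = 0

    print(np.allclose(A @ A + 2 * I, 0))               # True: A^2 + 2I = 0
    print(np.allclose(A + 2 * np.linalg.inv(A), 0))    # True: A + 2A^{-1} is the zero matrix
    ```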

    For the second problem: if B - I = 0, what does that say about B? (Hint: if x - 1 = 0, can you find x?)
  12. Oct 20, 2008 #11
    Thanks again.

    That makes much more sense. The only other thing is: when I find that the matrix expression equals 0, why does that mean there is no inverse? As you explained before, is it just because matrices that carry nonzero vectors to zero are non-invertible?

    So if I use the concept of characteristic polynomials; with the question:
    Prove as true or false: If a real matrix B satisfies B^3 - 2B + I = 0, then the matrix (B-I) cannot be invertible...

    I would factor out (B-I) from the cubic; use the concept you explained of characteristic polynomials and set B - I = 0; then x - 1 = 0, then x = 1, or B = I would be a solution where (B - I) is not invertible? So this would then prove that the matrix (B-I) cannot be invertible?

    And if this is true, then the same concept could be applied to the first question? I appreciate all the help... I know that if the determinant = 0, then the matrix is not invertible; but I just wasn't understanding the concept of setting the matrix = 0 and proving if it is invertible this way....

  13. Oct 20, 2008 #12


    Staff: Mentor

    Yes. And in particular the zero matrix of order n (the n x n matrix all of whose entries are 0) is noninvertible. It carries every vector x to 0.
    In this case I would leave out the part about x. The characteristic polynomial is indirectly related to the matrix polynomial.
    Instead, I would say this:
    If B - I = 0, then B = I.

    (The identity matrix minus itself is the zero matrix.)
    Yes. Also, see what I already wrote about the first problem.
  14. Oct 20, 2008 #13
    Thanks Mark,
    This makes a lot more sense to me now. So on other problems where I must decide whether an arbitrary matrix expression is invertible, I should look for a case where the n x n matrix equals the zero matrix (say, matrix A). If matrix A can equal 0 (the zero matrix), then it cannot have an A^-1, and therefore is not invertible..., correct?

    So, if I quickly make up a problem similar to the one I asked:
    If A is invertible, then A + A^-1 is also invertible?
    In this case I would apply the same concept, getting A^-1(A^2 + I). I would then set A^2 + I = 0, and try to find a case where this exists. If I use the characteristic polynomial concept you described and set x^2 + 1 = 0, I would look for a solution that would make it equal to zero, correct?

    Thanks again for clearing this up!

    One last question, I promise:
    In the case where we proved that (B-I) was not invertible (by setting B = I), would this lead one to make the conclusion that (B-I) is not invertible ever? Could there be cases where it is invertible, or if one case is proven where it does = 0, could we definitively say it is non-invertible?

  15. Oct 20, 2008 #14


    Staff: Mentor

    I posted a reply this morning, but closed my browser before the post was actually sent, so what I had put together was lost.

    Anyway, for a square matrix A, it's probably easier to show that it doesn't have an inverse than to prove that it is invertible.

    Consider the equation Ax = 0, where A is a square matrix. There are three possibilities, two of which are pretty obvious and one that isn't.
    1. x is the n-dimensional vector whose entries are all zero.
    2. A is the zero matrix.
    3. Neither of the above.

    An example of the third possibility is A = { (1 0); (0 0)} (listed row by row).
    x = (0 2)
    The product Ax is 0, even though A is not the zero matrix and x is not a 2-D zero vector.

    If A happens to be invertible, there are several things we can say, such as:
    The equation Ax = 0 has exactly one solution, the trivial solution x = 0. Also, for each vector b, the equation Ax = b has exactly one solution.
    det(A) != 0
    The transformation T whose matrix is A is one-to-one and onto.

    Regarding your problem about whether A + A^(-1) is invertible when A is, all I can say so far is that for two matrices A with complex entries (A = i*I and A = -i*I), the sum above is not invertible.

    I haven't yet been able to establish whether, for real matrices, A + A^(-1) must be invertible. I suspect that it need not be, so I'm looking for a real counterexample.

    Regarding your last question, any matrix whose entries are all zero cannot possibly be invertible. So in particular, if B - I = 0, then (B - I) is not invertible.
  16. Oct 20, 2008 #15
    Hey Mark

    Thanks for that explanation; it helped a great deal! I understand how to prove that a matrix is not invertible, and that to prove a matrix is invertible one can show that Ax = 0 has exactly one solution, etc.

    In regards to that last question:
    if a matrix B satisfies: (B^3)-2B+I=0, then the matrix (B-I) cannot be invertible?

    Although we found that if B = I, then (B - I) is not invertible, couldn't we also find a case where it is invertible? Since the problem asks whether (B - I) cannot be invertible, should I be looking for a counterexample to show that it can be invertible?

    If I factor it further to obtain:
    (B - I)(B - [-1/2 + sqrt(5)/2] I)(B - [-1/2 - sqrt(5)/2] I)
    couldn't I then set B = [-1/2 + sqrt(5)/2] I?
    And then, although B^3 - 2B + I = 0, (B - I) would equal [-3/2 + sqrt(5)/2] I; this would then be a case where (B - I) is invertible...?

    Can you tell me if I'm correct here, by stating that the answer would be False because there is this case where (B - I) is invertible?

    Thanks! i really appreciate all your time.
  17. Oct 21, 2008 #16


    Staff: Mentor

    Let's go back to an earlier question first, namely:
    If A is invertible, does it necessarily follow that A + A^(-1) is invertible?
    (I have changed the wording slightly, but not the meaning. The reason for doing this should become clear.)

    The answer is no. A single counterexample is sufficient to disprove the statement above. A counterexample is one in which the hypothesis (the if part) is satisfied, but the conclusion (the then part) fails.
    A = {(1 -1); (2 -1)} (in rows)

    A has an inverse, namely A^(-1) = {(-1 1); (-2 1)} (in rows)

    It's easy to verify that these matrices are inverses, since their product is the 2 x 2 identity matrix.

    A + A^(-1) = 0, which for reasons already given is not invertible.
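    A quick NumPy check of this counterexample:

    ```python
    import numpy as np

    A = np.array([[1.0, -1.0],
                  [2.0, -1.0]])
    Ainv = np.linalg.inv(A)

    print(np.allclose(Ainv, [[-1.0, 1.0], [-2.0, 1.0]]))  # True: matches the inverse above
    print(np.allclose(A + Ainv, 0))                       # True: the sum is the zero matrix
    ```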

    On to the next question: if a matrix B satisfies B^3 - 2B + I = 0, does it follow that B - I cannot be invertible?
    We have B^3 - 2B + I = 0.
    The characteristic polynomial here is x^3 -2x + 1.
    By long division, it can be seen that the polynomial above can be factored as (x - 1)(x^2 + x -1). The matrix polynomial factors in nearly the same way, with powers of x becoming powers of the matrix B, and constants becoming the constant times I.

    This means that the matrix equation can be factored as
    (B - I)(B^2 + B - I).
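    This factorization identity can be sanity-checked numerically for an arbitrary matrix B (a minimal NumPy sketch):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    B = rng.standard_normal((3, 3))   # any square matrix works; the identity holds for all B
    I = np.eye(3)

    lhs = B @ B @ B - 2 * B + I       # B^3 - 2B + I
    rhs = (B - I) @ (B @ B + B - I)   # (B - I)(B^2 + B - I)

    print(np.allclose(lhs, rhs))      # True: the two expressions agree
    ```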

    Since B^3 - 2B + I = 0, then
    (B - I)(B^2 + B - I) = 0.
    Be careful at this step: unlike numbers, matrices have zero divisors, so a product of matrices can be the zero matrix without either factor being zero. What we can say is this: if B - I were invertible, we could multiply both sides on the left by (B - I)^(-1) to get B^2 + B - I = 0. So any B satisfying the cubic either makes B - I noninvertible, or satisfies B^2 + B - I = 0.

    If B = I, then B - I is the zero matrix, so B - I is not invertible (which should be abundantly clear by now!).

    But as your counterexample shows, the second case really does occur with B - I invertible. So the statement is false: a matrix B can satisfy B^3 - 2B + I = 0 and still have B - I invertible. You are correct. Note especially that the question asked about B - I, NOT about whether B itself was invertible.

    On the other hand, if the problem asked for solutions of the matrix polynomial, then we have already found one, namely B = I. The other solutions B are solutions of the quadratic B^2 + B - I = 0. The solutions of the related characteristic equation, x^2 + x - 1 = 0,
    are x = -1/2 + sqrt(5)/2 and x = -1/2 - sqrt(5)/2.

    Among the solutions of the matrix quadratic are B = [-1/2 + sqrt(5)/2] * I and B = [-1/2 - sqrt(5)/2] * I.

    The first of these matrices has entries of -1/2 + sqrt(5)/2 on the main diagonal and zeros everywhere else. The second of these has -1/2 - sqrt(5)/2 on the main diagonal and zeros everywhere else.

    Again, what we've done in this new problem is to find solutions, B, of B^3 - 2B + I = 0. We did not establish whether those matrices B were themselves invertible. (In fact, all three are, but I'll leave that for you to investigate.)
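    These scalar-matrix solutions can be verified numerically as well (a minimal NumPy sketch; the printed determinant is approximate):

    ```python
    import numpy as np

    I = np.eye(2)
    c = (-1 + np.sqrt(5)) / 2          # a root of x^2 + x - 1 = 0
    B = c * I

    print(np.allclose(B @ B @ B - 2 * B + I, 0))   # True: B satisfies the cubic
    print(np.linalg.det(B - I))                    # about 0.146, nonzero: B - I is invertible
    ```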

  18. Oct 21, 2008 #17
    Thanks Mark!

    It's very clear now.
    I appreciate it tons!