Please check my solution to this invertibility test

  1. Mar 18, 2012 #1
    Q. Given that A is invertible, check whether A + A^(-1) is invertible or not.

    A. A is invertible => A^(-1) exists.

    (Sorry about not using the powers directly; I am not able to click on the required icon.)

    Suppose a non-trivial solution exists to ( A + A^(-1) ) X = 0
    (which means, let's assume A + A^(-1) is non-invertible).

    => a non-trivial solution exists to AX = A^(-1) (-X)

    or to AX = A A^(-2) (-X)

    or to A [ A^(-2) + I ] X = 0

    Clearly, if we set Y = [ A^(-2) + I ] X, then AY = 0 forces Y = 0, as A is invertible,

    i.e. [ A^(-2) + I ] X = 0.

    We also know that a non-trivial solution for X exists
    => [ A^(-2) + I ] is non-invertible.

    => A^(-2) must have an eigenvalue = -1
    => A also must have an eigenvalue = -1 (by the theorem of eigenvalue decomposition).

    => A + A^(-1) is non-invertible only when the matrix A has an eigenvalue = -1.
    Hence, generally speaking, it is not invertible.
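
    For what it's worth, here is a quick numerical sanity check (a minimal numpy sketch; the particular matrix is just an arbitrary invertible example I picked, not part of the argument):

    Code:
    import numpy as np

    # An arbitrary invertible matrix, just to test the question numerically.
    A = np.array([[2.0, 1.0],
                  [1.0, 1.0]])

    A_inv = np.linalg.inv(A)
    S = A + A_inv

    print("eigenvalues of A:", np.linalg.eigvals(A))
    print("det(A + A^(-1)) =", np.linalg.det(S))  # non-zero here, so A + A^(-1) is invertible for this A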

    Thanks a lot.
     
    Last edited: Mar 18, 2012
  3. Mar 18, 2012 #2

    chiro

    Science Advisor

    Hey vish_maths and welcome to the forums.

    If Det(A + A^(-1)) is non-zero, then the rows (or columns) of A + A^(-1) must not be linear combinations of each other. We know that A has this property, and so does A^(-1).

    From the eigenvalue decomposition we have:

    A = QBQ^(-1) and A^(-1) = QB^(-1)Q^(-1), where B is the diagonal eigenvalue matrix. This implies that A + A^(-1) = D = QBQ^(-1) + QB^(-1)Q^(-1) = Q(BQ^(-1) + B^(-1)Q^(-1)). Now this has to be invertible. We know that Q is invertible, so let's post-multiply everything by Q:

    DQ = Q(BQ^(-1) + B^(-1)Q^(-1))Q = Q(BQ^(-1)Q + B^(-1)Q^(-1)Q) = Q(B + B^(-1)). Now pre-multiply by Q^(-1) and we get Q^(-1)DQ = B + B^(-1). Since Q is invertible and since D needs to be invertible, we know that:

    Det(Q^(-1)DQ) = Det(Q^(-1))Det(D)Det(Q) = Det(D) = Det(B + B^(-1)) <> 0 has to hold.
    Now B is the diagonal matrix whose entries are just the eigenvalues of A.

    This means that B^(-1) + B has e_m + 1/e_m along the diagonal, for m = 1 to Dim(A), where Dim(A) is the dimension of the matrix. For this to be invertible, e_m + 1/e_m has to be non-zero. This implies e_m + 1/e_m <> 0, which implies e_m <> 1/e_m, which implies (e_m)^2 <> 0, which implies that e_m is not 0.

    So as long as you don't have a zero eigenvalue you should be OK. But you only get a zero eigenvalue if A is not invertible, which means that you will never encounter this situation, which proves that A + A^(-1) is invertible provided A is invertible.
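
    As a rough numerical check of the identity above (a sketch only; it assumes A is diagonalizable so that the Q^(-1) used in the algebra actually exists, and the sample matrix is just one I picked):

    Code:
    import numpy as np

    # A sample matrix that happens to be diagonalizable with an invertible Q.
    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])

    evals, Q = np.linalg.eig(A)   # columns of Q are eigenvectors, evals the eigenvalues
    B = np.diag(evals)            # the diagonal eigenvalue matrix

    D = A + np.linalg.inv(A)
    lhs = np.linalg.inv(Q) @ D @ Q
    rhs = B + np.linalg.inv(B)

    print(np.allclose(lhs, rhs))  # True: Q^(-1) (A + A^(-1)) Q = B + B^(-1)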

    I might have made an error so please double check this for any bad assumptions or bad parts of the proof.
     
  4. Mar 18, 2012 #3
    hello chiro :)

    Thanks for the reply.
    The eigenvalue decomposition is actually of the form

    AQ = QB,
    where Q is the eigenvector matrix and B is the diagonal eigenvalue matrix.

    A being invertible probably does not imply that its eigenvector matrix must also be invertible.

    Hence, I think A cannot be written in the form A = QBQ^(-1) in all cases.
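
    For example (a small numpy sketch; the shear matrix below is just the standard example of an invertible but defective matrix):

    Code:
    import numpy as np

    # Invertible, but NOT diagonalizable: eigenvalue 1 is repeated and has
    # only one independent eigenvector, so the eigenvector matrix Q is singular.
    A = np.array([[1.0, 1.0],
                  [0.0, 1.0]])

    evals, Q = np.linalg.eig(A)
    print("eigenvalues:", evals)         # [1. 1.]
    print("det(A) =", np.linalg.det(A))  # 1.0  -> A is invertible
    print("det(Q) =", np.linalg.det(Q))  # ~0   -> Q is (numerically) singular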

    Please confirm, and let's discuss in case there is any discrepancy.
    Thanks :)
     
    Last edited: Mar 18, 2012
  5. Mar 18, 2012 #4

    chiro

    Science Advisor

    Yes it does. If A is invertible then it does not have any zero eigenvalues, and since the eigenvalue matrix is diagonal with the eigenvalues as its entries, it must be invertible if every diagonal entry is non-zero.
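
    For instance, if ##B = \begin{bmatrix} \lambda_1 & \\ & \lambda_2 \end{bmatrix}## with ##\lambda_1, \lambda_2 \neq 0##, then ##B^{-1} = \begin{bmatrix} 1/\lambda_1 & \\ & 1/\lambda_2 \end{bmatrix}## exists.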

    If you want more information, take a look at this:

    http://en.wikipedia.org/wiki/Eigendecomposition_of_a_matrix#Eigendecomposition_of_a_matrix

    This is where I got the formulas for the standard decomposition of A and its inverse.
     
  6. Mar 18, 2012 #5

    AlephZero

    Science Advisor
    Homework Helper

    There must be something wrong here, because if
    ## A = \begin{bmatrix} 0 & 1 \\ -1 & 0 \end{bmatrix} ##, then ## A^{-1} = \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix} ##, and ## A + A^{-1} = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix} ##.
    But I'm not feeling energetic enough right now to find your mistake :rolleyes:
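
    (Just to confirm the arithmetic numerically, a one-liner in numpy:)

    Code:
    import numpy as np

    A = np.array([[0.0, 1.0],
                  [-1.0, 0.0]])
    print(A + np.linalg.inv(A))  # the zero matrix, so A + A^(-1) is certainly not invertible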
     
  7. Mar 18, 2012 #6

    chiro

    Science Advisor

    Yeah you're right that there is a mistake (good example btw).

    I'll have a look at it later when I have some free time.
     
  8. Mar 19, 2012 #7
    Hi chiro :)

    I am not talking about the diagonalizability of the eigenvalue matrix. I meant that, provided A is invertible, there is no guarantee that the eigenvector matrix has to be invertible.
    In your case you assumed Q to be invertible, but we don't know whether Q is invertible or not, though in the case of non-zero eigenvalues the eigenvalue matrix will always be invertible :)

    In short, we do not know whether the eigenvectors obtained are linearly independent or not,
    so A cannot be written in the form A = QBQ^(-1),
    because we don't know whether Q^(-1) exists or not :)

    Thanks
     
    Last edited: Mar 19, 2012
  9. Mar 19, 2012 #8
    Please have a look at my argument just above as well. Thanks :)
     
  10. Mar 19, 2012 #9

    chiro

    Science Advisor

    Yeah, you're both right. The proof assumes Q is invertible, which means it all falls apart.

    I guess what you could do, however, is find out when Q is actually invertible. That might help you get the condition for when the inverse of A + A^(-1) actually exists.

    Correct me if I'm wrong, but in my proof above (assuming Q is invertible), D has to be invertible, and if A has an inverse then B is also invertible, so the proof should work under that assumption (I hope!).

    What are the conditions for Q being invertible?
     
  11. Mar 20, 2012 #10

    AlephZero

    Science Advisor
    Homework Helper

    Q is invertible for almost all matrices. See http://en.wikipedia.org/wiki/Diagonalizable_matrix

    It is certainly invertible for my example.
    ## A = QBQ^{-1}##
    ## \begin{bmatrix} 0 & 1 \\ -1 & 0 \end{bmatrix}
    = \begin{bmatrix} 1 & i \\ i & 1 \end{bmatrix}
    \begin{bmatrix} i & \\ & -i \end{bmatrix}
    \begin{bmatrix} 1/2 & -i/2 \\ -i/2 & 1/2 \end{bmatrix}##
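
    If it helps, that factorization can also be checked numerically (a quick sketch using numpy's complex arrays):

    Code:
    import numpy as np

    A = np.array([[0, 1],
                  [-1, 0]], dtype=complex)
    Q = np.array([[1, 1j],
                  [1j, 1]], dtype=complex)
    B = np.diag([1j, -1j])

    print(np.allclose(Q @ B @ np.linalg.inv(Q), A))  # True: A = Q B Q^(-1)
    print(np.linalg.det(Q))                          # 2, so Q is invertible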
     
  12. Mar 20, 2012 #11

    AlephZero

    Science Advisor
    Homework Helper


    Your mistake is assuming that the eigenvalues are real. If A is a general real matrix, the eigenvalues may be complex, as in my example.
     
  13. Mar 20, 2012 #12

    chiro

    Science Advisor

    Ahh, OK. Thanks for that! :) I'll have to keep that in mind for the future, since a lot of the problems I've encountered usually assume the eigenvalues are real.

    This means that you have outlined the conditions for the OP's problem in a general sense.
     
  14. Mar 20, 2012 #13

    AlephZero

    Science Advisor
    Homework Helper

    The eigenvalues are always real for some important types of matrices, e.g. Hermitian matrices (the eigenvalues are always real even though the matrix elements may be complex). Real symmetric matrices are a special case of Hermitian.

    Actually my example was invented from the OP's original (and correct) argument. If an eigenvalue of ##A^2## is -1, then an eigenvalue of A must be ##\pm i##. The 1x1 matrix with element ##i## would do as a counterexample, but it's well known that the complex number ##a+ib## behaves the same way as the 2x2 real matrix
    ##\begin{bmatrix} a & b \\ -b & a \end{bmatrix}##.
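
    A small numerical illustration of that correspondence (the values of a and b below are arbitrary):

    Code:
    import numpy as np

    def real_rep(a, b):
        """The 2x2 real matrix that behaves like the complex number a + ib."""
        return np.array([[a, b],
                         [-b, a]])

    print(np.linalg.eigvals(real_rep(3.0, 2.0)))  # eigenvalues a +/- ib, here 3 +/- 2i
    M = real_rep(0.0, 1.0)                        # the real stand-in for the 1x1 matrix [i]
    print(np.linalg.det(M + np.linalg.inv(M)))    # 0.0: exactly the singular case above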
     
  15. Mar 21, 2012 #14
    Yeah :) I think I made a mistake in the last statement. The eigenvalue of A corresponding to the required condition is ##\pm\sqrt{-1} = \pm i##.
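
    So, putting it together (at least for the diagonalizable case worked out above): ##\det(A + A^{-1}) = 0 \iff \lambda + \tfrac{1}{\lambda} = 0## for some eigenvalue ##\lambda## of A, i.e. ##\lambda^2 = -1##, i.e. ##\lambda = \pm i##.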
     