Please check my solution to this invertibility test

In summary, the thread discusses whether the sum of an invertible matrix A and its inverse A^(-1) is itself invertible. Starting from the factorisation A + A^(-1) = A(A^(-2) + I), the participants conclude that the sum fails to be invertible exactly when A has an eigenvalue of ±i; the 2x2 rotation matrix with rows (0, 1) and (-1, 0) serves as a concrete counterexample, since for it A + A^(-1) is the zero matrix. Along the way, an eigendecomposition argument is corrected on two points: the eigenvector matrix Q need not be invertible (A may be defective), and the eigenvalues of a real matrix may be complex.
  • #1
vish_maths
Q. Given that A is invertible, check whether A + A^(-1) is invertible or not.

A. A is invertible => A^(-1) exists.

(Sorry about not writing the powers as superscripts; I am not able to click on the required icon.)

Suppose a nontrivial solution exists to (A + A^(-1))X = 0
(i.e. let's assume A + A^(-1) is non-invertible).

=> a nontrivial solution exists to AX = A^(-1)(-X)

or to AX = A A^(-2)(-X)

or to A[A^(-2) + I]X = 0

Writing Y = [A^(-2) + I]X, we have AY = 0, and since A is invertible this forces Y = 0,

i.e. [A^(-2) + I]X = 0

We also know that a nontrivial solution for X exists
=> [A^(-2) + I] is non-invertible.

=> A^(-2) must have an eigenvalue = -1
=> A also must have an eigenvalue = -1 (by the theorem of eigenvalue decomposition)

=> A + A^(-1) is non-invertible only when the matrix A has an eigenvalue = -1.
Hence, generally speaking, it is not guaranteed to be invertible.
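The factorisation underlying the argument above can be checked numerically. A minimal sketch (the example matrix is my own, chosen arbitrarily): since A + A^(-1) = A(A^(-2) + I), the sum is invertible exactly when A^(-2) + I is.

```python
import numpy as np

# Check the identity A + A^-1 = A (A^-2 + I) for an arbitrary
# invertible matrix (hypothetical example, not from the thread).
A = np.array([[2.0, 1.0], [0.0, 1.0]])   # det = 2, so A is invertible

lhs = A + np.linalg.inv(A)
rhs = A @ (np.linalg.matrix_power(A, -2) + np.eye(2))

print(np.allclose(lhs, rhs))   # True
```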

Thanks a lot.
 
  • #2
Hey vish_maths and welcome to the forums.

If Det(A + A^(-1)) is non-zero, then the rows (equivalently, the columns) of A + A^(-1) must be linearly independent. We know that A has this property, and so does A^(-1).

From the eigenvalue decomposition we have:

A = QBQ^(-1) and A^(-1) = QB^(-1)Q^(-1) for our eigenvalue diagonal matrix. This implies that A + A^(-1) = D = QBQ^(-1) + QB^(-1)Q^(-1) = Q(BQ^(-1) + B^(-1)Q^(-1)). Now this has to be invertible. We know that Q is invertible, so let's post-multiply everything by Q:

DQ = Q(BQ^(-1) + B^(-1)Q^(-1))Q = Q(BQ^(-1)Q + B^(-1)Q^(-1)Q) = Q(B + B^(-1)). Now pre-multiply by Q^(-1) and we get Q^(-1)DQ = B + B^(-1). Since Q is invertible and since D needs to be invertible, we know that:

Det(Q^(-1)DQ) = Det(Q^(-1))Det(D)Det(Q) = Det(D) = Det(B + B^(-1)) <> 0 has to hold.
Now B is the diagonal matrix whose entries are the eigenvalues.

This means that B^(-1) + B has along the diagonals e_m + 1/e_m for m = 1 to the Dim(A) where Dim(A) is the dimension of the matrix. For this to be invertible e_m + 1/e_m has to be non-zero. This implies e_m + 1/e_m <> 0 which implies e_m <> 1/e_m which implies (e_m)^2 <> 0 which implies that e_m is not 0.

So as long as you don't have a zero eigenvalue you should be OK. But you only get a zero eigenvalue if A is not invertible, which means that you will never encounter this situation, which proves that A + A^(-1) is invertible provided A is invertible.

I might have made an error so please double check this for any bad assumptions or bad parts of the proof.
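The diagonal-entries step above can be illustrated numerically. A small sketch, using a symmetric matrix of my own choosing (so Q is genuinely invertible and the eigenvalues are real): the eigenvalues of A + A^(-1) come out as e_m + 1/e_m.

```python
import numpy as np

# Hypothetical example: a symmetric (hence diagonalizable) matrix
# with real eigenvalues 3 and 1.
A = np.array([[2.0, 1.0], [1.0, 2.0]])

evals = np.linalg.eigvals(A)
S = A + np.linalg.inv(A)

# Eigenvalues of A + A^-1 match e_m + 1/e_m for each eigenvalue e_m of A.
print(np.sort(np.linalg.eigvals(S).real))
print(np.sort((evals + 1.0 / evals).real))
```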
 
  • #3
chiro said:
... From the eigenvalue decomposition we have:

A = QBQ^(-1) and A^(-1) = QB^(-1)Q^(-1) for our eigenvalue diagonal matrix. This implies that A + A^(-1) = D = QBQ^(-1) + QB^(-1)Q^(-1) = Q(BQ^(-1) + B^(-1)Q^(-1)). ...

I might have made an error so please double check this for any bad assumptions or bad parts of the proof.

Hello chiro :)

Thanks for the reply.
The eigenvalue decomposition is actually of the form

AQ = QB,

where Q is the eigenvector matrix and B is the diagonal eigenvalue matrix.

A being invertible probably does not imply that its eigenvector matrix is also invertible.

Hence, I think A cannot be written in the form A = QBQ^(-1) in all cases.

Please confirm, and let's discuss in case there is any discrepancy.
Thanks :)
 
  • #4
vish_maths said:
... A being invertible probably does not imply that its eigenvector matrix is also invertible.

Hence, I think A cannot be written in the form A = QBQ^(-1) in all cases. ...

Yes, it does. If A is invertible then it has no zero eigenvalues, and since the eigenvalue matrix is diagonal with the eigenvalues as its entries, it is invertible whenever every diagonal entry is non-zero.

If you want more information: take a look at this:

http://en.wikipedia.org/wiki/Eigendecomposition_of_a_matrix#Eigendecomposition_of_a_matrix

This is where I got the formulas for the standard decomposition of A and its inverse.
 
  • #5
chiro said:
... So as long as you don't have a zero eigenvalue you should be OK. But you only get a zero eigenvalue if A is not invertible, which means that you will never encounter this situation, which proves that A + A^(-1) is invertible provided A is invertible.

I might have made an error so please double check this for any bad assumptions or bad parts of the proof.

There must be something wrong here, because if
## A = \begin{bmatrix} 0 & 1 \\ -1 & 0 \end{bmatrix} ##, then ## A^{-1} = \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix} ##, and ## A + A^{-1} = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix} ##
But I'm not feeling energetic enough right now to find your mistake :rolleyes:
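For what it's worth, the counterexample is easy to confirm numerically (a quick sketch; nothing here beyond the matrices already given above):

```python
import numpy as np

# AlephZero's counterexample: a 90-degree rotation matrix.
A = np.array([[0.0, 1.0], [-1.0, 0.0]])
A_inv = np.linalg.inv(A)

print(A_inv)                    # [[0, -1], [1, 0]]
print(A + A_inv)                # the zero matrix
print(np.linalg.eigvals(A))     # +i and -i
```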
 
  • #6
AlephZero said:
There must be something wrong here, because if
## A = \begin{bmatrix} 0 & 1 \\ -1 & 0 \end{bmatrix} ##, then ## A^{-1} = \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix} ##, and ## A + A^{-1} = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix} ##
But I'm not feeling energetic enough right now to find your mistake :rolleyes:

Yeah, you're right that there is a mistake (good example, btw).

I'll have a look at it later when I have some free time.
 
  • #7
chiro said:
Yes, it does. If A is invertible then it has no zero eigenvalues, and since the eigenvalue matrix is diagonal with the eigenvalues as its entries, it is invertible whenever every diagonal entry is non-zero.

If you want more information: take a look at this:

http://en.wikipedia.org/wiki/Eigendecomposition_of_a_matrix#Eigendecomposition_of_a_matrix

This is where I got the formulas for the standard decomposition of A and its inverse.

Hi chiro :)

I am not talking about the diagonalizability of the eigenvalue matrix. I meant that even if A is invertible, there is no guarantee that the eigenvector matrix is invertible.
In your case you assumed Q to be invertible, but we don't know whether Q is invertible or not, though for non-zero eigenvalues the eigenvalue matrix is always invertible :)

In short, we do not know whether the eigenvectors obtained are linearly independent.
So A cannot always be written in the form A = QBQ^(-1),
because we don't know whether Q^(-1) exists :)

thanks
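To make the objection concrete, here is a small sketch with a matrix of my own choosing: a Jordan block is invertible but defective, so its eigenvector matrix is singular.

```python
import numpy as np

# Hypothetical example: an invertible but non-diagonalizable matrix.
A = np.array([[1.0, 1.0], [0.0, 1.0]])   # det = 1, eigenvalue 1 repeated

eigenvalues, Q = np.linalg.eig(A)
print(eigenvalues)                 # [1., 1.]
print(np.linalg.det(Q))            # ~0: the eigenvector matrix is singular
```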
 
  • #8
AlephZero said:
There must be something wrong here, because if
## A = \begin{bmatrix} 0 & 1 \\ -1 & 0 \end{bmatrix} ##, then ## A^{-1} = \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix} ##, and ## A + A^{-1} = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix} ##
But I'm not feeling energetic enough right now to find your mistake :rolleyes:

Please have a look at my argument just above as well. Thanks :)
 
  • #9
vish_maths said:
Please have a look at my argument just above as well. Thanks :)

Yeah, you're both right. The proof assumes Q is invertible, which means it all falls apart.

I guess what you could do, however, is find out when Q is actually invertible. That might help you get the condition under which the inverse of A + A^(-1) actually exists.

Correct me if I'm wrong, but in my proof above (assuming Q is invertible), D has to be invertible, and if A has an inverse then B is also invertible, so the proof should work under that assumption (I hope!).

What are the conditions for Q being invertible?
 
  • #10
Q is invertible for almost all matrices. See http://en.wikipedia.org/wiki/Diagonalizable_matrix

It is certainly invertible for my example.
## A = QBQ^{-1}##
## \begin{bmatrix} 0 & 1 \\ -1 & 0 \end{bmatrix}
= \begin{bmatrix} 1 & i \\ i & 1 \end{bmatrix}
\begin{bmatrix} i & \\ & -i \end{bmatrix}
\begin{bmatrix} 1/2 & -i/2 \\ -i/2 & 1/2 \end{bmatrix}##
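A quick numerical confirmation of this factorisation (just re-entering the matrices above):

```python
import numpy as np

# AlephZero's decomposition A = Q B Q^-1 for the rotation matrix.
Q = np.array([[1.0, 1.0j], [1.0j, 1.0]])
B = np.diag([1.0j, -1.0j])
Q_inv = np.array([[0.5, -0.5j], [-0.5j, 0.5]])

A = Q @ B @ Q_inv
print(A.real)                                  # [[0, 1], [-1, 0]]
print(np.allclose(Q @ Q_inv, np.eye(2)))       # True
```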
 
  • #11
chiro said:
This means that B^(-1) + B has along the diagonals e_m + 1/e_m for m = 1 to the Dim(A) where Dim(A) is the dimension of the matrix. For this to be invertible e_m + 1/e_m has to be non-zero. This implies e_m + 1/e_m <> 0 which implies e_m <> 1/e_m which implies (e_m)^2 <> 0 which implies that e_m is not 0.
(My bold).

Your mistake is assuming that the eigenvalues are real. If A is a general real matrix, the eigenvalues may be complex, as in my example.
 
  • #12
AlephZero said:
(My bold).

Your mistake is assuming that the eigenvalues are real. If A is a general real matrix, the eigenvalues may be complex, as in my example.

Ah, OK. Thanks for that! :) I'll have to keep that in mind for the future, since a lot of the problems I've encountered usually assume the eigenvalues are real.

This means that you have outlined the conditions for the OP's problem in a general sense.
 
  • #13
They are always real for some important types of matrix, e.g. Hermitian matrices (the eigenvalues are always real even though the matrix elements may be complex). Real symmetric matrices are a special case of Hermitian.

Actually my example was invented from the OP's original (and correct) argument. If an eigenvalue of ##A^2## is -1, then an eigenvalue of A must be ##\pm i##. The 1x1 matrix with element ##i## would do as a counterexample, but it's well known that the complex number ##a+ib## behaves the same way as the 2x2 real matrix
##\begin{bmatrix} a & b \\ -b & a \end{bmatrix}##.
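The correspondence can be sketched in code (the helper name is mine): the map a + ib → [[a, b], [-b, a]] respects multiplication, and the representative of i is exactly the rotation matrix from the counterexample, with eigenvalues ±i.

```python
import numpy as np

def as_matrix(z: complex) -> np.ndarray:
    """Represent a + ib as the real 2x2 matrix [[a, b], [-b, a]]."""
    return np.array([[z.real, z.imag], [-z.imag, z.real]])

# The representation respects multiplication of complex numbers.
z, w = 1 + 2j, 3 - 1j
print(np.allclose(as_matrix(z) @ as_matrix(w), as_matrix(z * w)))  # True

# The representative of i is the rotation matrix used earlier,
# with eigenvalues +i and -i.
print(np.linalg.eigvals(as_matrix(1j)))
```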
 
  • #14
Yeah :) I think I made a mistake in the last statement. The eigenvalue of A corresponding to the required condition is ±√(-1) = ±i.
 

