MHB Invertible Matrices: Why These Statements Are Not Correct

Yankel
I have one more question. I have two matrices $A$ and $B$, both square and of the same order, and a scalar $a$ not equal to zero.

Why are these statements not correct?

1. If $A$ and $B$ are invertible, then $a\cdot (B^{-1}A^{-1}B)^{t}$ is not necessarily invertible.

2. If $A$ and $B$ are invertible, then $a\cdot (A+B)$ is not necessarily invertible, but if it is, its inverse is the matrix $\frac{1}{a}\cdot (A^{-1}+B^{-1})$.

Thanks...
 
For the first one, I'm going to assume WLOG that $A$ and $B$ are $n\times n$ matrices. If $A$ and $B$ are invertible, then $\det(A)\neq 0$ and $\det(B)\neq 0$. Furthermore, if $a\neq 0$ is a scalar, then
\[\det (a\cdot(B^{-1}A^{-1}B)^t)=a^n\det((B^{-1}A^{-1}B)^t) = a^n \det(B^{-1}A^{-1}B)=\frac{a^n\det(B)}{\det(B)\det(A)}=\frac{a^n}{\det(A)}.\]
Since $a\neq 0$ and $\det(A)\neq 0$, this determinant is nonzero, so $a\cdot(B^{-1}A^{-1}B)^t$ is always invertible provided that $A$ and $B$ are invertible and $a\neq 0$ is a scalar.
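
As a quick sanity check of this computation (not part of the original argument), here is a small NumPy sketch; the size $n=3$, the scalar $a=2.5$, and the randomly generated $A$ and $B$ are arbitrary choices for illustration.

```python
# Minimal numerical sketch (illustrative only): check that a*(B^{-1} A^{-1} B)^T
# is invertible for randomly chosen invertible A, B and a nonzero scalar a,
# and that its determinant equals a^n / det(A).
import numpy as np

rng = np.random.default_rng(0)
n, a = 3, 2.5

# Random matrices are invertible with probability 1; we still check det != 0.
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
assert abs(np.linalg.det(A)) > 1e-12 and abs(np.linalg.det(B)) > 1e-12

M = a * (np.linalg.inv(B) @ np.linalg.inv(A) @ B).T

# det(M) should equal a^n / det(A), which is nonzero, so M is invertible.
print(np.linalg.det(M), a**n / np.linalg.det(A))
```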

For the second one, it's true that if $A$ and $B$ are invertible, it may so happen that $a\cdot (A+B)$ isn't invertible. For instance, take $A=I$, $B=-I$ and $a=1$, where $I$ is the identity matrix. Then both matrices are invertible, but $A+B=O$, where $O$ is the zero matrix, which is clearly not invertible.
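
A minimal NumPy sketch of this counterexample, assuming $n=3$ (any size works; note $\det(-I)=(-1)^n$):

```python
# Quick check of the counterexample: A = I and B = -I are both invertible,
# yet a*(A + B) is the zero matrix, which is not invertible.
import numpy as np

n, a = 3, 1.0
A = np.eye(n)
B = -np.eye(n)

print(np.linalg.det(A), np.linalg.det(B))   # both nonzero: 1.0 and -1.0 for odd n
S = a * (A + B)
print(S)                                    # zero matrix
print(np.linalg.matrix_rank(S))             # rank 0, hence not invertible
```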

Even if $a\cdot(A+B)$ were invertible, its inverse would not in general have that form. We can see this since
\[\left(a\cdot(A+B)\right)\left(\frac{1}{a}\cdot(A^{-1}+B^{-1})\right)=AA^{-1}+AB^{-1}+BA^{-1}+BB^{-1}=AB^{-1}+BA^{-1}+2I\neq I\]
in general; a concrete numerical check is sketched below.

I hope this helps!
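
A minimal NumPy sketch of such a check; the choice $A=B=I$, $a=1$ is just one illustrative case, not from the original post.

```python
# Sketch (illustrative only): for A = B = I and a = 1, the sum a*(A + B) = 2I
# is invertible, but the proposed inverse (1/a)*(A^{-1} + B^{-1}) is 2I,
# not the true inverse 0.5*I.
import numpy as np

n, a = 3, 1.0
A = np.eye(n)
B = np.eye(n)

true_inv = np.linalg.inv(a * (A + B))                         # = 0.5 * I
proposed = (1.0 / a) * (np.linalg.inv(A) + np.linalg.inv(B))  # = 2 * I

print(np.allclose(true_inv, proposed))   # False
print((a * (A + B)) @ proposed)          # = A B^{-1} + B A^{-1} + 2I = 4I here, not I
```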
 
