Uniqueness of Inverse Matrices: Proof and Explanation

SUMMARY

The discussion centers on the uniqueness of inverse matrices, specifically the equation $A \cdot A^{-1} = A^{-1} \cdot A = I$. For an invertible square matrix this equation determines $A^{-1}$ uniquely, but the analogous condition for vectors under the dot product, like a one-sided inverse of a non-square matrix, does not: many different "inverses" satisfy it. The confusion in the thread comes from misapplying the cancellation properties of matrix multiplication, which require the cancelled factor to be invertible, and from treating a vector as just a special case of a matrix.

PREREQUISITES
  • Understanding of matrix operations, specifically multiplication and inverses.
  • Familiarity with the concept of one-sided inverses in linear algebra.
  • Knowledge of vector spaces and the properties of dot products.
  • Basic understanding of groups in mathematics, particularly the general linear group.
NEXT STEPS
  • Study the properties of invertible matrices in linear algebra.
  • Learn about one-sided inverses and their implications in matrix theory.
  • Explore the concept of vector spaces and their dimensionality.
  • Investigate the structure of the general linear group and its significance in mathematics.
USEFUL FOR

Students of linear algebra, mathematicians exploring matrix theory, and educators teaching concepts related to matrix inverses and vector spaces.

ognik
I have an exercise which says to show that for vectors, $ A \cdot A^{-1} = A^{-1} \cdot A = I $ does NOT define $ A^{-1}$ uniquely.

But let's assume there are at least two inverses of $A$, say $A^{-1} = B$ and $A^{-1} = C$.

Then $ A \cdot B = I = A \cdot C , \therefore BAB = BAC, \therefore B=C$, therefore $ A^{-1}$ is unique? (got lazy with dots)
 
ognik said:
I have an exercise which says to show that for vectors, $ A \cdot A^{-1} = A^{-1} \cdot A = I $ does NOT define $ A^{-1}$ uniquely.

But let's assume there are at least two inverses of $A$, say $A^{-1} = B$ and $A^{-1} = C$.

Then $ A \cdot B = I = A \cdot C , \therefore BAB = BAC, \therefore B=C$, therefore $ A^{-1}$ is unique? (got lazy with dots)

Hi ognik, :)

I am not getting your question exactly. Do you mean that you have to show that $A^{-1}$ is unique? Could you please elaborate a bit.
 
I'm not sure, to be honest. I wonder if it is more that the identity does not define the inverse uniquely, i.e. there could be more than one inverse...
 
You'll have to be a bit more explicit in exactly what you're asking.

If, for a matrix $A$, there exists a matrix $B$ such that $AB = BA = I$, then $B$ is unique (invertible $n \times n$ matrices form a *group*, called the general linear group of degree $n$ over whatever field you're working with, and inverses in a group are unique).
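
To make the uniqueness claim concrete, here is the standard one-line derivation (not quoted from the thread, just the usual group argument): if $AB = BA = I$ and $AC = CA = I$, then
$$B = BI = B(AC) = (BA)C = IC = C.$$
Note that this uses the inverse on *both* sides: the step $(BA)C = IC$ needs $BA = I$, which is exactly what fails for one-sided inverses.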

However, if $B$ is merely a one-sided inverse, that is $AB = I$ or $BA = I$, but not both, $B$ may not be unique. This only happens when $A$ is non-square.
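
As a quick illustration of that (a minimal NumPy sketch added here; the matrices are made up for the example), a $2 \times 3$ matrix can have many different right inverses:

```python
import numpy as np

# A is 2x3, so it can only have a right inverse (A @ B = I_2), never a two-sided one.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])

# Two different right inverses: the last row of B is completely free.
B1 = np.array([[1.0, 0.0],
               [0.0, 1.0],
               [0.0, 0.0]])
B2 = np.array([[1.0, 0.0],
               [0.0, 1.0],
               [5.0, 7.0]])

print(np.allclose(A @ B1, np.eye(2)))  # True
print(np.allclose(A @ B2, np.eye(2)))  # True
print(np.allclose(B1, B2))             # False: the right inverse is not unique
```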

Furthermore, your question begins "For vectors..."; it is hard to imagine what the (multiplicative) inverse of a vector might be.
 
It seems to me that a multiplicative inverse of a vector is a vector such that the dot product with it is equal to $1$.
So for instance:
$$\begin{bmatrix}2 \\ 3\end{bmatrix} \cdot \begin{bmatrix}1/2 \\ 0\end{bmatrix} = 1$$
 
I had what ILS suggests in mind, but I found the question hard to follow myself. Hoping this is helpful, here it is exactly as follows:

[Attachment: vectorInverse.png (image of the exercise as stated)]
As the dot product is commutative, the fact that $A\cdot A^{-1}=A^{-1}\cdot A$ is superfluous. Choose
$$A=\begin{bmatrix}1 \\ -1\end{bmatrix}.$$
Choose
$$B=\begin{bmatrix}1 \\ 0\end{bmatrix} \qquad \text{and} \qquad C=\begin{bmatrix}2 \\ 1\end{bmatrix}.$$
Then $A\cdot B=1$ and $A\cdot C=1$, but $B\not=C$.
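
One can check this numerically (a small NumPy sketch added for illustration, using the vectors from the example above), and in fact the solutions form a whole line:

```python
import numpy as np

# The counterexample above: one vector A with two different "dot-product inverses".
A = np.array([1.0, -1.0])
B = np.array([1.0, 0.0])
C = np.array([2.0, 1.0])

print(np.dot(A, B))          # 1.0
print(np.dot(A, C))          # 1.0
print(np.array_equal(B, C))  # False

# Any C = B + t * [1, 1] also works, because [1, 1] is orthogonal to A.
t = 3.7
C2 = B + t * np.array([1.0, 1.0])
print(np.dot(A, C2))         # 1.0 (up to rounding)
```

The solution set is an affine translate of the subspace orthogonal to $A$, which is the same picture described more generally later in the thread.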
 
I see that, thanks, but then what is wrong with my original 'proof'? It applies to matrices, and a vector is a type of matrix?
 
ognik said:
I see that, thanks, but then what is wrong with my original 'proof'? It applies to matrices, and a vector is a type of matrix?

I presume you're referring to $BAB=BAC∴B=C$.

It appears you're removing $BA$ from the left side.
But that would correspond to multiplying on the left with $(BA)^{-1}$.
This can only be done if $BA$ is invertible and we cannot assume any such thing, and in fact we know that it's not.
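
To see this concretely with matrices (a minimal NumPy sketch I'm adding; the matrices are invented for the example, not taken from the exercise): take a non-square $A$ with two different right inverses $B$ and $C$. Then $BA$ is square but singular, so it cannot be cancelled, and indeed $BAB = BAC$ while $B \neq C$:

```python
import numpy as np

A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])           # 2x3
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])                # one right inverse of A
C = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [5.0, 7.0]])                # another right inverse of A

BA = B @ A                                # 3x3 but only rank 2, hence singular
print(np.linalg.matrix_rank(BA))          # 2, so (BA)^{-1} does not exist

# BAB = BAC even though B != C, because B(AB) = B(AC) = B:
print(np.allclose(B @ A @ B, B @ A @ C))  # True
print(np.allclose(B, C))                  # False
```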
 
  • #10
You may or may not know this, but any vector $v$ in a finite-dimensional inner product space determines (uniquely) a *hyperplane* (a subspace of dimension: $\dim(V) - 1$):

$E_v = \{w \in V: \langle v,w\rangle = 0\}$.

The vector $\dfrac{1}{\|v\|}v$ serves as a (unit) *normal* to $E_v$. If we choose an orientation for $V$, then either:

$\dfrac{1}{\|v\|}v$ or $-\dfrac{1}{\|v\|}v$ can be called *the* normal to the hyperplane $E_v$. (The "usual" orientation for $\Bbb R^3$ is set so that $\mathbf{i} \times \mathbf{j} = \mathbf{k}$, often called the "right-hand" orientation, since it corresponds to pointing the right-hand index finger along the (positive) $x$-axis and the right-hand middle finger along the (positive) $y$-axis, with the thumb pointing up along the (positive) $z$-axis. This is a purely arbitrary convention, which is why cross products are often called "pseudo-vectors": their sign isn't independent of axis orientation.)

So...where was I?

Pick any vector in $E_v$, say, $w$, and consider $w + v$.

Then $\langle w+v,v\rangle = \langle w,v\rangle + \langle v,v\rangle = 0 + \|v\|^2 \neq 0$ (unless $v = 0$).

Thus $\langle\dfrac{1}{\|v\|^2}(w+v),v\rangle = 1$, and it is clear we have just as many such vectors as we have elements of $E_v$.
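
Here is a small numerical sketch of this construction (the specific vectors are made up for illustration): pick $v$, pick any $w$ orthogonal to $v$, and $\dfrac{1}{\|v\|^2}(w+v)$ always pairs to $1$ with $v$:

```python
import numpy as np

rng = np.random.default_rng(0)

v = np.array([3.0, 4.0, 0.0])                    # any nonzero v; ||v||^2 = 25

# Build several vectors w in the hyperplane E_v = {w : <w, v> = 0}
# by projecting random vectors onto the orthogonal complement of v.
for _ in range(3):
    r = rng.normal(size=3)
    w = r - (np.dot(r, v) / np.dot(v, v)) * v    # component of r orthogonal to v
    u = (w + v) / np.dot(v, v)                   # the vector (w + v) / ||v||^2
    print(np.dot(u, v))                          # always 1.0 (up to rounding)
```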

Why does your result not hold when $\dim(V) = 1$?
 
  • #11
I like Serena said:
I presume you're referring to $BAB=BAC∴B=C$.

It appears you're removing $BA$ from the left side.
But that would correspond to multiplying on the left with $(BA)^{-1}$.
This can only be done if $BA$ is invertible and we cannot assume any such thing, and in fact we know that it's not.
That's a surprise; I thought we effectively had $(BA)B = (BA)C$, therefore $B$ must equal $C$?
 
  • #12
ognik said:
I see that thanks, but then what is wrong with my original 'proof'? It applies to matrices and a vector is a type of matrix?

I like Serena said:
I presume you're referring to $BAB=BAC∴B=C$.

It appears you're removing $BA$ from the left side.
But that would correspond to multiplying on the left with $(BA)^{-1}$.
This can only be done if $BA$ is invertible and we cannot assume any such thing, and in fact we know that it's not.

I would also add that it's not at all clear that $BAB$ is well-defined. If the multiplication is a dot product, then the result of $A\cdot B$ is a number, not a vector; in that case, it's unclear what $B\cdot A \cdot B$ would even be. How would you define that?
 
  • #13
Ackbach said:
I would also add that it's not at all clear that $BAB$ is well-defined. If the multiplication is a dot product, then the result of $A\cdot B$ is a number, not a vector; in that case, it's unclear what $B\cdot A \cdot B$ would even be. How would you define that?
Indeed.

ognik said:
That's a surprise; I thought we effectively had $(BA)B = (BA)C$, therefore $B$ must equal $C$?
If $(BA)$ is for instance the zero matrix, we cannot conclude that $B=C$.

Just for fun:
$$a^2-a^2=a^2-a^2 \Rightarrow a(a-a) =(a+a)(a-a) \Rightarrow a=a+a \Rightarrow a=2a$$
This holds for any $a$, therefore $1=2$.
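
The matrix analogue of dividing by $a - a = 0$ is cancelling a singular (for instance zero) $BA$. A tiny made-up NumPy example of nonzero matrices whose product is zero, so that nothing can be cancelled:

```python
import numpy as np

B = np.array([[0.0, 1.0],
              [0.0, 0.0]])
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])

print(B @ A)                              # the zero matrix, even though neither factor is zero

# Consequently B @ A @ X == B @ A @ Y for *any* X and Y, so X = Y cannot be concluded.
X = np.array([[1.0, 2.0],
              [3.0, 4.0]])
Y = np.array([[5.0, 6.0],
              [7.0, 8.0]])
print(np.allclose(B @ A @ X, B @ A @ Y))  # True, yet X != Y
```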
 
  • #14
I like Serena said:
Indeed.
If $(BA)$ is for instance the zero matrix, we cannot conclude that $B=C$.
I had postulated both $B$ and $C = A^{-1}$ where we are looking at $A$ non-singular, but I need to remember that $BA$ could be $0$, even if $B$ and $A$ aren't.

Is $AA^{-1}=1$ by definition only?
 
