Prove the identity matrix is unique

AI Thread Summary
The discussion centers on proving the uniqueness of the identity matrix, with the user struggling to understand how the equation AC = BC implies A = B. They explore the implications of using different matrices A, including the zero matrix, and express confusion about the role of invertibility in their proof. Participants suggest that if A is invertible, one can multiply both sides of the equation by A's inverse to conclude that I_1 = I_2. The conversation also touches on the nature of vectors as matrices and their invertibility, clarifying that only square matrices can be invertible. Ultimately, the consensus is that the identity element must be unique under the right conditions.
  • #51
Please everyone accept my apologies for my posts on this thread. I completely missed that we were talking about rectangular matrices!
 
  • #52
PeroK said:
Please everyone accept my apologies for my posts on this thread. I completely missed that we were talking about rectangular matrices!
Your point about the non-square matrices makes me wonder about my simple "proof" (post #50) which seems to say that two matrices with different dimensions are equal. I will have to think about that.
 
  • #53
Maybe I'm simplifying this too much, but can't this be done with a simple proof by contradiction (and with index notation)?

Let ##A## be an ##n \times n## (square) matrix.
Let ##B## be an ##n \times p## matrix.
Let ##AB = B##, which is ##n \times p##.

Therefore ##B_{ij} = \left(AB\right)_{ij} = A_{ik} B_{kj}##, summing over the repeated index ##k##.

Now suppose ##A_{ik} \neq \delta_{ik}##; then we would have ##B_{ij} \neq B_{ij}##, a contradiction. Therefore ##A_{ik}## must equal ##\delta_{ik}##.

Did I dun goof?
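As a quick sanity check of where the "for every ##B##" matters, here is a pure-Python sketch (the `matmul` helper is ad hoc): for a *single* ##B##, ##AB = B## does not force ##A## to be the identity.

```python
# Check that AB = B for one particular B does not force A = I;
# the uniqueness argument needs AB = B for every B.

def matmul(x, y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(x[i][k] * y[k][j] for k in range(len(y)))
             for j in range(len(y[0]))]
            for i in range(len(x))]

I = [[1, 0], [0, 1]]
A = [[1, 1], [0, 1]]      # not the identity
B = [[1, 0], [0, 0]]      # one particular (singular) B

print(matmul(A, B) == B)  # True: AB = B even though A != I
print(A == I)             # False
```

So the index argument goes through only if it is required to hold for every ##B##, not for one fixed ##B##.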
 
  • #54
An identity matrix ##I## is a matrix having the property ##IA=A## for every ##A##. (Here I assume that matrices are square and have a fixed dimension, so "every" means every square matrix of that fixed dimension.) To prove that ##I## is unique, the idea is to assume the opposite and derive a contradiction. So let us assume that it is not unique, in which case we have two different matrices ##I_1## and ##I_2## having the properties ##I_1A=A## and ##I_2A=A## for every ##A##, which implies
$$(I_1-I_2)A=A-A=0$$
for every ##A##. Since it must be true for every ##A##, it follows in particular that it must be true for every invertible ##A##. But for invertible ##A## we can multiply this from the right with ##A^{-1}##, which gives
$$I_1-I_2=0$$
i.e. ##I_1=I_2##, which contradicts the initial assumption that ##I_1## and ##I_2## were different. Hence the initial assumption was wrong, which proves that ##I## is unique. Q.E.D.
 
  • #55
Demystifier said:
An identity matrix ##I## is a matrix having the property ##IA=A## for every ##A##. (Here I assume that matrices are square and have a fixed dimension, so "every" means every square matrix of that fixed dimension.) To prove that ##I## is unique, the idea is to assume the opposite and derive a contradiction. So let us assume that it is not unique, in which case we have two different matrices ##I_1## and ##I_2## having the properties ##I_1A=A## and ##I_2A=A## for every ##A##, which implies
$$(I_1-I_2)A=A-A=0$$
for every ##A##. Since it must be true for every ##A##, it follows in particular that it must be true for every invertible ##A##. But for invertible ##A## we can multiply this from the right with ##A^{-1}##, which gives
$$I_1-I_2=0$$
i.e. ##I_1=I_2##, which contradicts the initial assumption that ##I_1## and ##I_2## were different. Hence the initial assumption was wrong, which proves that ##I## is unique. Q.E.D.
How is ##A^{-1}## defined? The definition is: a matrix, if it exists, such that ##AA^{-1}=A^{-1}A=I##. But which identity, if we don't know there is only one?
 
  • #56
martinbn said:
How is ##A^{-1}## defined? The definition is: a matrix, if it exists, such that ##AA^{-1}=A^{-1}A=I##. But which identity, if we don't know there is only one?
Good point, here is a correct proof. The identity matrix is actually defined by two properties ##IA=A## and ##AI=A##, for every ##A##. Now suppose there are two such ##I##'s, namely
$$I_1A=AI_1=A$$
and
$$I_2A=AI_2=A$$
for every ##A##. Considering the cases ##A=I_2## in the first line and ##A=I_1## in the second, we get the equalities
$$I_1I_2=I_2I_1=I_2$$
and
$$I_2I_1=I_1I_2=I_1$$
which implies ##I_1=I_2##, Q.E.D.
 
  • #57
Demystifier said:
Good point, here is a correct proof. The identity matrix is actually defined by two properties ##IA=A## and ##AI=A##, for every ##A##. Now suppose there are two such ##I##'s, namely
$$I_1A=AI_1=A$$
and
$$I_2A=AI_2=A$$
for every ##A##. Considering the cases ##A=I_2## in the first line and ##A=I_1## in the second, we get the equalities
$$I_1I_2=I_2I_1=I_2$$
and
$$I_2I_1=I_1I_2=I_1$$
which implies ##I_1=I_2##, Q.E.D.
Yes, that is in @FactChecker's post #44.
 
  • #58
No additional structure is required. Given a set ##A## and a multiplication ##\cdot : A\times A\to A##, if there exists ##e\in A## such that ##ea=a=ae## for every ##a\in A##, then for any ##e'\in A## with this property one has ##e=ee'=e'##. In short, the identity is unique.
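For any finite multiplication table this can be checked by brute force; here is a small Python sketch (the `identities` helper and the `min` example are illustrative choices of mine):

```python
# Brute-force check on a finite multiplication table:
# a two-sided identity, if it exists, is unique.

def identities(elements, op):
    """Return all e with op(e, a) == a == op(a, e) for every a."""
    return [e for e in elements
            if all(op(e, a) == a == op(a, e) for a in elements)]

# Toy example: ({0, 1, 2}, min) has 2 as its two-sided identity.
S = [0, 1, 2]
print(identities(S, min))  # [2] -- exactly one identity, as the argument predicts
```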
 
  • #59
You may also argue: if ##AI_1=AI_2##, then ##A[I_1-A_2]=0##. In the most general sense, ##I_1-I_2## is in the right kernel of ##A##. If ##A## is an invertible matrix, this right kernel must be trivial. As Fresh_42 points out, beyond that setting of invertible matrices, it depends on what type of object ##A## is.
 
  • #60
WWGD said:
You may also argue, if ##AI_1=AI_2##, then ##A[I_1-A_2]=0##.
I think you mean ##A(I_1 - I_2) = 0##.
 
  • Like
Likes FactChecker
  • #61
Mark44 said:
I think you mean ##A(I_1 - I_2) = 0##.
Ok, but just how are the two different? I'm not aware of any particular meaning of ##[I_1 - I_2]##.
 
  • #62
WWGD said:
Ok, but just how are the two different?

WWGD said:
I'm not aware of any particular meaning of ##[I_1 - I_2]##.
You wrote ##A(I_1 - A_2)## but I thought you meant ##A(I_1 - I_2)##.
If ##A(I_1 - I_2) = 0##, then using determinants you can deduce that ##I_1 = I_2##, which was the whole point in being able to say that the identity matrix must be unique.
 
  • #63
Mark44 said:
You wrote ##A(I_1 - A_2)## but I thought you meant ##A(I_1 - I_2)##.
If ##A(I_1 - I_2) = 0##, then using determinants you can deduce that ##I_1 = I_2##, which was the whole point in being able to say that the identity matrix must be unique.
True, if we assume ##A## is invertible. Otherwise its right kernel isn't trivial, i.e., it's not just ##\{0\}##, so in the case of ##A## being singular it doesn't follow that ##I_1=I_2##. But if ##A## is nonsingular, then you're right.
 
  • #64
WWGD said:
True, if we assume ##A## is invertible. Otherwise its right kernel isn't trivial, i.e., it's not just ##\{0\}##, so in the case of ##A## being singular it doesn't follow that ##I_1=I_2##. But if ##A## is nonsingular, then you're right.
From the beginning of this thread, it must be stated that the identities, ##I_1## and ##I_2##, work as identities for every possible ##A##. Otherwise, the conclusion that ##I_1=I_2## may be false.
 
  • #65
FactChecker said:
From the beginning of this thread, it must be stated that the identities, ##I_1## and ##I_2##, work as identities for every possible ##A##. Otherwise, the conclusion that ##I_1=I_2## may be false.
Ok, I guess I lost track of the "initial conditions". Using the determinant alone (assuming ##A## is square):

##\det\big(A(I_1-I_2)\big) = \det(A)\,\det(I_1-I_2) = 0##, which implies that at least one of the determinants is ##0##.

Though ##\det(I_1-I_2)=0## doesn't imply ##I_1=I_2##.

But I admit I may have somewhat lost track of where the discussion veered.
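A concrete ##2\times 2## instance of that caveat (the helpers here are ad hoc): a nonzero matrix can have zero determinant, and a singular ##A## can annihilate it under multiplication.

```python
# det(M) = 0 does not force M = 0, and a singular A can send
# a nonzero M to zero under left multiplication.

def det2(m):
    """Determinant of a 2x2 matrix given as a list of rows."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def matmul(x, y):
    return [[sum(x[i][k] * y[k][j] for k in range(len(y)))
             for j in range(len(y[0]))]
            for i in range(len(x))]

M = [[1, 0], [0, 0]]      # nonzero, but det(M) = 0
A = [[0, 0], [0, 1]]      # singular A
Z = [[0, 0], [0, 0]]

print(det2(M))            # 0
print(matmul(A, M) == Z)  # True: A*M = 0 although M != 0
```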
 
  • #66
PeroK said:
I completely missed that we were talking about rectangular matrices!
For the OP question it should not matter; uniqueness of the identity should be provable for any group (or at least any group that doesn't have some other weirdness like zero divisors). @FactChecker seems to me to have come up with the simplest derivation.
 
  • #67
PeterDonis said:
For the OP question it should not matter; uniqueness of the identity should be provable for any group (or at least any group that doesn't have some other weirdness like zero divisors). @FactChecker seems to me to have come up with the simplest derivation.
The set of ##m \times n## matrices (where ##m \ne n##) does not form a multiplicative group. The usual matrix multiplication is not even well defined on it.
 
  • #68
PeroK said:
The set of ##m \times n## matrices (where ##m \ne n##) does not form a multiplicative group. The usual matrix multiplication is not even well defined on it.
This refers to multiplicative groups.
 
  • #69
PeroK said:
The set of ##m \times n## matrices (where ##m \ne n##) does not form a multiplicative group. The usual matrix multiplication is not even well defined on it.
And in such cases there is no "identity" at all. So the OP question isn't even applicable.
 
  • #70
PeterDonis said:
And in such cases there is no "identity" at all. So the OP question isn't even applicable.
An ##n \times n## matrix is a linear transformation on the set of ##n \times m## matrices (under left multiplication). The question related to the identity transformation in this context.
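A quick check of that viewpoint in pure Python (the `matmul` helper is ad hoc; here ##n=2##, ##m=3##): left multiplication by ##I_n## fixes every ##n \times m## matrix.

```python
# I_n acts as the identity transformation on rectangular n x m
# matrices under left multiplication (sketched with n=2, m=3).

def matmul(x, y):
    return [[sum(x[i][k] * y[k][j] for k in range(len(y)))
             for j in range(len(y[0]))]
            for i in range(len(x))]

I2 = [[1, 0], [0, 1]]
B = [[1, 2, 3], [4, 5, 6]]   # an arbitrary 2x3 matrix

print(matmul(I2, B) == B)    # True: left-multiplying by I_2 fixes B
```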
 