Please everyone accept my apologies for my posts on this thread. I completely missed that we were talking about rectangular matrices!
PeroK said:
Please everyone accept my apologies for my posts on this thread. I completely missed that we were talking about rectangular matrices!

Your point about the non-square matrices makes me wonder about my simple "proof" (post #50), which seems to say that two matrices with different dimensions are equal. I will have to think about that.
Demystifier said:
An identity matrix ##I## is a matrix having the property ##IA=A## for every ##A##. (Here I assume that matrices are square and have a fixed dimension, so "every" means every square matrix of that fixed dimension.) To prove that ##I## is unique, the idea is to assume the opposite and derive a contradiction. So let us assume that it is not unique, in which case we have two different matrices ##I_1## and ##I_2## having the properties ##I_1A=A## and ##I_2A=A## for every ##A##, which implies
$$(I_1-I_2)A=A-A=0$$
for every ##A##. Since it must be true for every ##A##, it follows in particular that it must be true for every invertible ##A##. But for invertible ##A## we can multiply this from the right by ##A^{-1}##, which gives
$$I_1-I_2=0$$
i.e. ##I_1=I_2##, which contradicts the initial assumption that ##I_1## and ##I_2## were different. Hence the initial assumption was wrong, which proves that ##I## is unique. Q.E.D.

How is ##A^{-1}## defined? The definition is: a matrix, if it exists, such that ##AA^{-1}=A^{-1}A=I##. But which identity, if we don't know there is only one!
martinbn said:
How is ##A^{-1}## defined? The definition is: a matrix, if it exists, such that ##AA^{-1}=A^{-1}A=I##. But which identity, if we don't know there is only one!

Good point, here is a correct proof. The identity matrix is actually defined by two properties, ##IA=A## and ##AI=A## for every ##A##. Now suppose there are two such ##I##'s, namely
$$I_1A=AI_1=A$$
and
$$I_2A=AI_2=A$$
for every ##A##. Considering the cases ##A=I_2## in the first line and ##A=I_1## in the second, we get the equalities
$$I_1I_2=I_2I_1=I_2$$
and
$$I_2I_1=I_1I_2=I_1$$
which imply ##I_1=I_2##. Q.E.D.

Demystifier said:
Good point, here is a correct proof. The identity matrix is actually defined by two properties, ##IA=A## and ##AI=A## for every ##A##.

Yes, that is in @FactChecker's post #44.
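As a quick sanity check, the two-sided identity argument above can be illustrated numerically. This is only an illustrative sketch (the use of NumPy and the variable names are mine, not part of the thread):

```python
import numpy as np

n = 3
I1 = np.eye(n)  # one candidate two-sided identity
I2 = np.eye(n)  # another matrix assumed to satisfy the same two-sided property

# The key step of the proof: plugging A = I2 into I1 A = A gives I1 I2 = I2,
# and plugging A = I1 into A I2 = A gives I1 I2 = I1.
product = I1 @ I2
assert np.allclose(product, I2)  # I1 I2 = I2
assert np.allclose(product, I1)  # I1 I2 = I1

# The same product equals both candidates, hence I1 = I2.
assert np.allclose(I1, I2)
```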
WWGD said:
You may also argue, if ##AI_1=AI_2##, then ##A[I_1-A_2]=0##.

I think you mean ##A(I_1 - I_2) = 0##.
Mark44 said:
I think you mean ##A(I_1 - I_2) = 0##.

Ok, but just how are the two different? I'm not aware of any particular meaning of ##[I_1 - I_2]##.
WWGD said:
Ok, but just how are the two different? I'm not aware of any particular meaning of ##[I_1 - I_2]##.

You wrote ##A(I_1 - A_2)##, but I thought you meant ##A(I_1 - I_2)##.
Mark44 said:
You wrote ##A(I_1 - A_2)##, but I thought you meant ##A(I_1 - I_2)##.

True, if we assume ##A## is invertible. Otherwise its right kernel isn't trivial, i.e. it's not just ##\{0\}##, so when ##A## is singular it doesn't follow that ##I_1=I_2##. But if ##A## is nonsingular, then you're right.
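WWGD's caveat can be made concrete with a small sketch (the particular matrices here are my own choice for illustration): for a singular ##A##, the equation ##A(I_1 - I_2) = 0## does not force ##I_1 = I_2##.

```python
import numpy as np

# A is singular: its second row is zero, so its right kernel is nontrivial.
A = np.array([[1.0, 0.0],
              [0.0, 0.0]])

I1 = np.eye(2)              # the genuine identity
I2 = np.array([[1.0, 0.0],
               [5.0, 1.0]])  # differs from I1 in the bottom-left entry

# A(I1 - I2) = 0 even though I1 != I2: the columns of (I1 - I2)
# lie in the right kernel (null space) of A.
assert np.allclose(A @ (I1 - I2), 0)
assert not np.allclose(I1, I2)
```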
If ##A(I_1 - I_2) = 0## for an invertible ##A##, then using determinants you can deduce that ##I_1 = I_2##, which was the whole point in being able to say that the identity matrix must be unique.
WWGD said:
True, if we assume ##A## is invertible. Otherwise its right kernel isn't trivial, i.e. it's not just ##\{0\}##, so when ##A## is singular it doesn't follow that ##I_1=I_2##. But if ##A## is nonsingular, then you're right.

From the beginning of this thread, it must be stated that the identities ##I_1## and ##I_2## work as identities for every possible ##A##. Otherwise, the conclusion that ##I_1=I_2## may be false.
FactChecker said:
From the beginning of this thread, it must be stated that the identities ##I_1## and ##I_2## work as identities for every possible ##A##. Otherwise, the conclusion that ##I_1=I_2## may be false.

Ok, I guess I lost track of the "initial conditions". Using the determinant alone (assuming ##A## is square):
PeroK said:
I completely missed that we were talking about rectangular matrices!

For the OP question it should not matter; uniqueness of the identity should be provable for any group (or at least any group that doesn't have some other weirdness like zero divisors). @FactChecker seems to me to have come up with the simplest derivation.
PeterDonis said:
For the OP question it should not matter; uniqueness of the identity should be provable for any group (or at least any group that doesn't have some other weirdness like zero divisors). @FactChecker seems to me to have come up with the simplest derivation.

The set of ##m \times n## matrices (where ##m \ne n##) does not form a multiplicative group. The usual matrix multiplication is not even well defined.
PeroK said:
The set of ##m \times n## matrices (where ##m \ne n##) does not form a multiplicative group. The usual matrix multiplication is not even well defined.

This refers to multiplicative groups.
PeroK said:
The set of ##m \times n## matrices (where ##m \ne n##) does not form a multiplicative group. The usual matrix multiplication is not even well defined.

And in such cases there is no "identity" at all. So the OP question isn't even applicable.
PeterDonis said:
And in such cases there is no "identity" at all. So the OP question isn't even applicable.

An ##n \times n## matrix is a linear transformation on the set of ##n \times m## matrices (under left multiplication). The question related to the identity transformation in this context.
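This last point can also be illustrated with a quick sketch (shapes chosen arbitrarily by me): the ##n \times n## identity acts as the identity transformation on rectangular ##n \times m## matrices under left multiplication, while the right identity on that set is the ##m \times m## identity instead.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 3, 5
B = rng.standard_normal((n, m))  # a rectangular n x m matrix

# Left multiplication by the n x n identity leaves B unchanged.
I_n = np.eye(n)
assert np.allclose(I_n @ B, B)

# B @ I_n is not even defined (shape mismatch); on the right,
# the m x m identity plays the corresponding role.
I_m = np.eye(m)
assert np.allclose(B @ I_m, B)
```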