- #1

askmathquestions

- Homework Statement
- Prove the identity matrix is unique.

- Relevant Equations
- I_1 * A = A, I_2 * A = A

I would appreciate help walking through this. I've put solid effort into it, but there are roadblocks and questions I can't seem to get past. This is homework I've assigned myself, because these are nagging questions that are bothering me and that I can't figure out. I'm studying purely on my own, with no professor, using the freely accessible MIT OpenCourseWare material on linear algebra.

I'm trying to prove, and understand why, the identity matrix is unique, but I can't quite see how $AC = BC$ implies $A = B$: I don't know how you can suddenly remove the $C$ from the equation. Here's where I'm at (and I don't know if there's a LaTeX editor I can use):

Let $I_1$ and $I_2$ be two $n \times n$ matrices acting on an $n \times p$ matrix $A$, such that $I_1 A = A$ and $I_2 A = A$. Suppose $A$ is not identically the zero matrix.

How do we show $ I_1 = I_2 $ ?

We have by equality that

$I_2 I_1 A = I_2 A = A,$

and so $I_1 A = I_2 A$.

But how do I make the leap to saying $I_1 = I_2$? Every other attempt I have is just some combinatoric mess of matrices; there's something fundamental I'm not getting, and I don't know what.
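For comparison, here is the standard textbook argument (not something derived from the setup above): if the identity property is required to hold for *every* compatible matrix, and we stay with square $n \times n$ matrices so that $I_1$ and $I_2$ can act on each other, uniqueness takes one line.

```latex
% Standard uniqueness argument for a two-sided identity among n x n matrices:
% assume I_1 M = M and M I_2 = M for every n x n matrix M.
I_1 = I_1 I_2 = I_2
% The first equality uses M I_2 = M with M = I_1;
% the second uses I_1 M = M with M = I_2.
```

The key difference from the setup above is the quantifier: the identities are assumed to work on *all* matrices, so in particular on each other, and no cancellation of a fixed $A$ is ever needed.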

If we made additional assumptions in the framework, we could require that $A$ be invertible, but then we'd lose the identity's uniqueness on non-invertible matrices.
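The worry in the last paragraph is well founded, and a small numeric check makes it concrete: a left identity on one *fixed* matrix $A$ need not be unique, precisely because $AC = BC$ does not force $A = B$ when $C$ is not invertible. A minimal sketch (the specific matrices here are my own illustrative choice, not from the course material):

```python
# Counterexample: two different 2x2 matrices I1 != I2 that both act as a
# "left identity" on one particular nonzero matrix A. This shows that
# I1 * A = I2 * A does not imply I1 = I2 when A is not invertible.

def matmul(X, Y):
    """Plain list-of-lists matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))]
            for i in range(len(X))]

I1 = [[1, 0],
      [0, 1]]          # the usual 2x2 identity
I2 = [[1, 0],
      [0, 0]]          # a genuinely different matrix
A  = [[1],
      [0]]             # a nonzero 2x1 matrix (not invertible: not even square)

print(matmul(I1, A))   # [[1], [0]]  -> equals A
print(matmul(I2, A))   # [[1], [0]]  -> also equals A, yet I1 != I2
```

So cancelling $A$ from $I_1 A = I_2 A$ is only legitimate when $A$ has a one-sided inverse to multiply by; for a single fixed $A$ like this one, the "identity" is simply not unique.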

This reminds me of another question that's bothering me: are column vectors, like $x = [[x_1],[x_2],[x_3]]$, "invertible" matrices? Conceivably, we could define a row vector $y = (1/3) [[1/x_1, 1/x_2, 1/x_3]]$ (assuming every $x_i$ is nonzero) so that the product $y x$ equals $1$. But I'm confused, because historically we don't refer to vectors as "matrices", we refer to them as "vectors", so it feels odd to treat a vector as a matrix. Furthermore, this $1$ produced by multiplying $y$ and $x$ is just a scalar quantity, not a matrix, so I don't know whether to say $y$ is a "left inverse" of $x$. I'm confused by how the dimensions of each component keep changing.
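As a dimension check on the row-vector construction above (the concrete entries are my own illustrative choice; exact rationals are used so the thirds come out exact): treating the column vector as a $3 \times 1$ matrix and the row vector as $1 \times 3$, the product $y x$ is a $1 \times 1$ matrix whose single entry is $1$. Matrix multiplication never literally produces a bare scalar; the "scalar" here is a matrix of size $1 \times 1$.

```python
# Dimension check for y * x, where x is a 3x1 column and y is the 1x3 row
# y = (1/3) [1/x_1, 1/x_2, 1/x_3], assuming every x_i is nonzero.
from fractions import Fraction

def matmul(X, Y):
    """Plain list-of-lists matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))]
            for i in range(len(X))]

x = [[Fraction(2)],
     [Fraction(4)],
     [Fraction(8)]]                              # 3x1 column vector

y = [[Fraction(1, 3 * 2),
      Fraction(1, 3 * 4),
      Fraction(1, 3 * 8)]]                       # 1x3 row vector, (1/3)[1/x_i]

p = matmul(y, x)
print(p)            # [[Fraction(1, 1)]] -- a 1x1 matrix, not a bare scalar
```

Note that the other order, $x y$, would instead be a $3 \times 3$ matrix, so $y$ behaves like a left inverse only and the two products have different shapes.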
