Multiplicative Identity under Matrix Multiplication

In summary, the conversation discusses whether a group of singular matrices can exist, despite the fact that singular matrices are non-invertible by definition. It is pointed out that a multiplicative identity can be any matrix satisfying AI = IA = A, not just the typical diagonal "identity matrix". An example of such a group is given, and there is discussion about how almost every group can be represented by invertible matrices. The conversation ends with a conjecture, and then a proof, that any group of matrices acting on a vector space V acts invertibly on some stable subspace U, and is therefore isomorphic to a subgroup of GL(U).
  • #1
rachbomb
I've been asked by my professor to identify a group of singular matrices. At first, I did not think this was possible, since a singular matrix is non-invertible by definition, yet to prove a group's existence, every such singular matrix must have an inverse.

It has been brought to my attention, however, that a multiplicative identity need not be the typical diagonal "identity matrix" but can instead be any matrix I for which AI = IA = A.
For example, take the matrix
[tex]\begin{pmatrix}3&3\\0&0\end{pmatrix}.[/tex]
Since the determinant of this matrix is 0, it satisfies the singular requirement. However, my classmate is claiming that the multiplicative inverse for this matrix is
[tex]\begin{pmatrix}1/3&1/3\\0&0\end{pmatrix},[/tex]
which would indeed satisfy the above requirement of AI = IA = A.

Is this possible? Please help, I'm so confused!
 
  • #2
A very simple example of a group of matrices under matrix multiplication is the set of all n×n matrices whose entries are all zero except for the upper-left entry, which is nonzero.
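For instance, in the 2×2 case this group consists of the matrices [tex]\begin{pmatrix}a&0\\0&0\end{pmatrix}[/tex] with a ≠ 0, and the group axioms can be checked directly:
[tex]\begin{pmatrix}a&0\\0&0\end{pmatrix}\begin{pmatrix}b&0\\0&0\end{pmatrix}=\begin{pmatrix}ab&0\\0&0\end{pmatrix},\qquad
\begin{pmatrix}1&0\\0&0\end{pmatrix}\begin{pmatrix}a&0\\0&0\end{pmatrix}=\begin{pmatrix}a&0\\0&0\end{pmatrix},\qquad
\begin{pmatrix}a&0\\0&0\end{pmatrix}\begin{pmatrix}1/a&0\\0&0\end{pmatrix}=\begin{pmatrix}1&0\\0&0\end{pmatrix}.[/tex]
The set is closed, the (singular) matrix with a single 1 in the upper-left corner is the multiplicative identity, and every element has an inverse in the set.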
 
  • #3
I need to find an example of a group of singular, non-diagonal matrices under matrix multiplication. Does such a thing exist?
 
  • #4
Why is the multiplicative inverse allegedly acting like the multiplicative identity?

I think I see the group that your classmate was pointing towards. Multiplying your matrices
[tex]\begin{pmatrix}3&3\\0&0\end{pmatrix} \quad\text{and}\quad \begin{pmatrix}1/3&1/3\\0&0\end{pmatrix}[/tex]
together we get the matrix
[tex]\begin{pmatrix}1&1\\0&0\end{pmatrix}.[/tex]
These matrices have something in common: their bottom row is zero.
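Spelling out where that observation leads: matrices with equal entries in the top row and a zero bottom row multiply as
[tex]\begin{pmatrix}x&x\\0&0\end{pmatrix}\begin{pmatrix}y&y\\0&0\end{pmatrix}=\begin{pmatrix}xy&xy\\0&0\end{pmatrix},[/tex]
so the set of all such matrices with x ≠ 0 is closed under multiplication, [tex]\begin{pmatrix}1&1\\0&0\end{pmatrix}[/tex] acts as the multiplicative identity, and [tex]\begin{pmatrix}1/x&1/x\\0&0\end{pmatrix}[/tex] is the inverse of [tex]\begin{pmatrix}x&x\\0&0\end{pmatrix}[/tex], giving a group of singular matrices.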
 
  • #5
You can get your friend's and Hurkyl's examples to match by changing the basis.
e.g.
[tex]P = \begin{pmatrix}1&1\\0&-1\end{pmatrix}[/tex]
[tex]A = \begin{pmatrix}a&0\\0&0\end{pmatrix}[/tex]
[tex]P^{-1} A P = \begin{pmatrix}a&a\\0&0\end{pmatrix}[/tex]
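As a quick check of the conjugation, note that here [tex]P^{-1} = P[/tex], since [tex]P^2 = \begin{pmatrix}1&0\\0&1\end{pmatrix}[/tex], so the computation is easy to verify by hand.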

I guess that nearly all examples you find will reduce like this.
Basically, almost every group can be represented by invertible matrices.
The representation we have above is essentially the direct sum of an invertible rep and the one-element group, in which
[tex] 0 = \mathrm{id}. [/tex]
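Concretely, in the diagonal basis the matrices split into 1×1 blocks,
[tex]\begin{pmatrix}a&0\\0&0\end{pmatrix}=(a)\oplus(0),[/tex]
where the blocks (a), a ≠ 0, form an invertible representation and the block (0) by itself is a one-element group whose identity element is 0.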
 
  • #6
Take any group of (n-1)×(n-1) matrices, then extend each matrix to dimension n by having it send the last basis vector to zero, i.e. by appending a row and a column of zeros.
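In block form (writing out the extension as a check):
[tex]M \mapsto \begin{pmatrix}M&0\\0&0\end{pmatrix},\qquad \begin{pmatrix}M&0\\0&0\end{pmatrix}\begin{pmatrix}N&0\\0&0\end{pmatrix}=\begin{pmatrix}MN&0\\0&0\end{pmatrix},[/tex]
so products are preserved, every extended matrix is singular, and the image of the original group's identity serves as the identity of the new group.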
 
  • #7
Conjecture: Given a group G of matrices acting on a vector space V, there is a subspace U of V that is stable under G (i.e. g(U) is contained in U for each g in G), and restricting the elements of G to U gives an injective homomorphism from G to GL(U).
 
  • #8
I don't know about the injective part, but the invertible part is easy.

Let U be the smallest vector subspace which is stable under the action of G... this exists and is unique, since if W and T are stable under G, then G takes [tex]W\cap T[/tex] into both W and T, so it takes [tex]W\cap T[/tex] into [tex]W\cap T[/tex]. Then if the matrices are invertible on a stable subspace, they must also be invertible on the restriction to this subspace, so it suffices to look only at the smallest stable subspace. We will call this U.

Claim: the identity element of G, call it I, acts as the identity on U. If we prove this, then we have proven that all the other elements of G are invertible when restricted to U. Suppose not. We know that I^2 = I, so I is a projection onto a subspace of U; call it W. Claim: every element A of G has A(U) contained in W (which means that U was not the smallest stable subspace, since W in particular is stable). We know that IA = A, so for u in U we have Au = IAu = Iv for some v in U; Iv is in W, so Au is in W.

Hence I acts as the identity on U, and the group in fact acts by invertible linear transformations on U. Now let U be the largest subspace on which G acts by invertible linear transformations. If T and W are subspaces on which G acts invertibly (is that the right word?), then G acts invertibly on T+W. Again we only need to confirm that the identity acts as the identity on T+W, but I(t+w) = It + Iw = t + w. So U exists and is unique.

If the matrices all act differently, it will be on this subspace, but I don't have a proof or counterexample for this.
 
  • #9
My point about the injective homomorphism was that G could be regarded as a subgroup of GL(U); the goal was to characterize all groups of matrices on V.

If you chose U to be the smallest stable subspace, then U would clearly be 0, which is totally uninteresting. (In that case, the homomorphism G -> GL(U) would be trivial, since GL(U) is.)
 
  • #10
adriank said:
My point about the injective homomorphism was that G could be regarded as a subgroup of GL(U); the goal was to characterize all groups of matrices on V.

If you chose U to be the smallest stable subspace, then U would clearly be 0, which is totally uninteresting. (In that case, the homomorphism G -> GL(U) would be trivial, since GL(U) is.)

Right. That just shortens the proof a little bit. The idea is that you pick U small only to show that there exists a subspace on which every matrix of G acts invertibly; then you switch gears and look at the largest such subspace. That most likely won't be 0. You can't just look at the largest such subspace without first proving that subspaces on which G acts invertibly exist, though. I was just typing on the fly, so it was a pretty ugly way to describe things.

To shorten things up considerably:

Let U = I(V). Since I^2 = I, I truly acts as the identity on U, which means that every element A of G, when restricted to U, acts as an invertible transformation (with inverse A^{-1} in G, restricted to U). Then what we need to do is show that if A and B are in G and act identically on U, they act identically off of U as well.
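For example, in the 2×2 group from earlier in the thread (taking real entries for concreteness), [tex]I = \begin{pmatrix}1&1\\0&0\end{pmatrix}[/tex], so
[tex]U = I(\mathbb{R}^2) = \left\{\begin{pmatrix}t\\0\end{pmatrix} : t \in \mathbb{R}\right\},\qquad \begin{pmatrix}a&a\\0&0\end{pmatrix}\begin{pmatrix}t\\0\end{pmatrix}=\begin{pmatrix}at\\0\end{pmatrix},[/tex]
and each group element restricted to U is just multiplication by a, which is invertible for a ≠ 0.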
 
  • #11
I have a proof. Let e be the identity of G, and let U = e(V) be the image of e.

U is stable under G, since for g in G and u in U, we have g(u) = e(g(u)) is in U. And in fact, e acts as the identity on U, since for u in U, we have u = e(v) for some v in V, and e(u) = e(e(v)) = e(v) = u.

Now for g in G, let ρ(g) = g|U denote the restriction of g to U; since U is stable under g, we regard ρ(g) as a linear map from U to U. It's clear that for g and g' in G, we have ρ(gg') = ρ(g)ρ(g'). The claim is that ρ(g) is in GL(U). Indeed, if g^{-1} denotes the inverse of g in G, then ρ(g)ρ(g^{-1}) = ρ(gg^{-1}) = ρ(e) is the identity of GL(U), and likewise for ρ(g^{-1})ρ(g). Thus ρ is a group homomorphism from G to GL(U).

Finally, we must show that ρ is injective. So if g is in ker(ρ), then ρ(g) = ρ(e) so g acts as the identity on U. But then clearly g = e, since G is a group.

edit: Ahhh, you just posted before I did. But I showed ρ is injective.

The conclusion: If G is a group of operators acting on a space V, then G is isomorphic to a subgroup of GL(U) for some subspace U of V.
 
  • #12
g acts as the identity on U, but g could do something outside of U that's different from what e does. I don't think it can in finite-dimensional cases, from trying to construct an example, but maybe some weird infinite-dimensional situation can screw you up.
 
  • #13
But you see: g = ge = e. If v is in V, then e(v) is in U, so g(v) = (ge)(v) = g(e(v)) = e(v); hence g = e.
 

Related to Multiplicative Identity under Matrix Multiplication

1. What is the multiplicative identity under matrix multiplication?

The multiplicative identity under matrix multiplication is a special matrix that behaves like the number 1 in regular multiplication. When this matrix is multiplied with any other matrix, the result is the same as the original matrix. In other words, it does not change the values of the other matrix.

2. How is the multiplicative identity represented in matrix form?

The multiplicative identity is represented by the identity matrix, which is a square matrix with 1s on the main diagonal (from top left to bottom right) and 0s everywhere else. The size of the identity matrix depends on the dimensions of the other matrix it is being multiplied with.
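For example, the 3×3 identity matrix is
[tex]I_3 = \begin{pmatrix}1&0&0\\0&1&0\\0&0&1\end{pmatrix}.[/tex]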

3. Why is the multiplicative identity important in matrix multiplication?

The multiplicative identity is important because it plays the role of the number 1 in matrix algebra. It appears in the definition of the matrix inverse, A A^{-1} = A^{-1} A = I, which is what lets us solve a linear system Ax = b as x = A^{-1}b when A is invertible, and it provides the normalization for determinants, since det I = 1.

4. How does the multiplicative identity affect the product of two matrices?

The multiplicative identity has a similar effect on matrix multiplication as the number 1 has on regular multiplication. When it is multiplied with another matrix, the resulting product will have the same values as the original matrix. This means that the identity does not change the result of the multiplication.

5. Can the multiplicative identity be applied to non-square matrices?

The identity matrix itself is always square, but identity matrices can still be used with non-square matrices: an m×n matrix A satisfies I_m A = A and A I_n = A, where the identity on each side has a different size. What a non-square matrix cannot have is a single two-sided identity, since no one square matrix I makes both AI and IA defined; the relation AI = IA = A only makes sense when A is square.
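For instance, with a 2×3 matrix:
[tex]\begin{pmatrix}1&0\\0&1\end{pmatrix}\begin{pmatrix}a&b&c\\d&e&f\end{pmatrix}=\begin{pmatrix}a&b&c\\d&e&f\end{pmatrix}=\begin{pmatrix}a&b&c\\d&e&f\end{pmatrix}\begin{pmatrix}1&0&0\\0&1&0\\0&0&1\end{pmatrix}.[/tex]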
