Is Simplifying Group Axioms to Just Left Identity and Inverses Valid?

Summary:
The discussion explores whether a simplified definition of a group, using only left identity and left inverses, is valid. It proposes that a set G with a binary operation is a group if the operation is associative and every member has a left inverse, implying the existence of a left identity. The author grapples with proving that this definition encompasses all necessary group properties, particularly the existence of a right identity. They conclude that the definition must be supplemented to ensure that the left identity is consistent across all elements. The conversation also touches on notation conventions in group theory, specifically regarding the representation of the underlying set.
Fredrik
Consider a binary operation on a set G. An element e of G is said to be a left identity if ex=x for all x. If x is in G, an element y of G is said to be a left inverse of x if yx is a left identity. A right identity and a right inverse are defined similarly. Is the following an adequate definition of "group"?

The pair ##\big(G,(x,y)\mapsto xy\big)## is said to be a group if
(1) the binary operation is associative,
(2) every member has a left inverse.

(By my definition of "left inverse", (2) implies that a left identity exists, so there's no need to mention that in a separate axiom.) I have seen the claim that the group axioms usually written as ##ex=xe=x## and ##x^{-1}x=xx^{-1}=e## can be simplified to ##ex=x## and ##x^{-1}x=e## without changing the meaning of the word "group", but I don't quite see how that can be sufficient. It's possible that I have weakened the definition too much by not explicitly requiring that ##x^{-1}x## is the same left identity for all x. I'll have to add that as part of the definition if it can't be proved from the other axioms. Without including that as an axiom, I'm able to prove that
  • There's at most one right identity.
  • Every x has at most one right inverse.
  • If ##x^{-1}## is a left inverse of x, then x is a left inverse of ##x^{-1}##.
  • If f is a right identity, then for any x, f is a right inverse of ##x^{-1}x##.
  • If ##x^{-1}x## has a right inverse, then x is a right inverse of ##x^{-1}##.
  • If x is a right inverse of ##x^{-1}##, then ##x^{-1}## is a right inverse of x.
I'll show the calculations if someone asks for them. I don't see how to finish the proof that a "group" in the sense of the definition above is actually a group. It looks like I only need to prove that there's a right identity.

This might be one of those times when I see the answer immediately after I post the question.
 
I think there are models that aren't groups.
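In fact, an exhaustive search over the 16 binary operations on a two-element set already finds one. (A quick sketch of my own; the helper names are just for this check, and it only uses the definitions from the opening post.)

```python
from itertools import product

def is_assoc(op, n):
    return all(op[op[x][y]][z] == op[x][op[y][z]]
               for x in range(n) for y in range(n) for z in range(n))

def left_identities(op, n):
    return [e for e in range(n) if all(op[e][x] == x for x in range(n))]

def has_left_inverses(op, n):
    # every x must have some y such that yx is a left identity
    lids = set(left_identities(op, n))
    return all(any(op[y][x] in lids for y in range(n)) for x in range(n))

def has_right_identity(op, n):
    return any(all(op[x][f] == x for x in range(n)) for f in range(n))

n = 2
counterexamples = []
for table in product(range(n), repeat=n * n):
    # interpret the flat tuple as a Cayley table, row by row
    op = [list(table[i * n:(i + 1) * n]) for i in range(n)]
    if (is_assoc(op, n) and has_left_inverses(op, n)
            and not has_right_identity(op, n)):
        counterexamples.append(op)

# the search finds op[x][y] = y: every element is then a left identity,
# so every element has a left inverse, but there is no right identity
print(counterexamples)
```

The operation it turns up is just the "right projection" xy = y, which satisfies (1) and (2) but is clearly not a group.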

My idea is to adjust the free group on a set of symbols. We have left inverses, so we can cancel on the left; what if we can't cancel on the right? I want the rightmost symbol to pick out a class of words that form a group, but without the rightmost symbol itself being cancellable.


In the normal free group, the elements are words in the alphabet of generators and inverses of the generators, modulo the rewrite rules
xx' → ε
x'x → ε
(the right-hand sides are the empty word)

My plan is that you only allow these rewrite rules if they do not occur on the right edge of a word.

If I haven't made any errors, then each of the words of the form xx' and x'x is a left identity, for x one of the generators, but all of them are unequal.
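This can be prototyped in a few lines. (A sketch of my own, with a single generator written x and its formal inverse written X, restricting attention to nonempty words; the leftmost-cancellation strategy is just one way to compute normal forms.)

```python
# cancellable pairs: a generator next to its formal inverse
CANCEL = {"xX", "Xx"}

def normalize(word):
    # repeatedly delete a cancellable pair, but never one whose
    # right symbol is the last symbol of the word (the "right edge")
    changed = True
    while changed:
        changed = False
        for i in range(len(word) - 2):   # ensures i+1 < len(word)-1
            if word[i:i+2] in CANCEL:
                word = word[:i] + word[i+2:]
                changed = True
                break
    return word

def mul(u, v):
    return normalize(u + v)

# "xX" and "Xx" both act as left identities on sample nonempty words...
for w in ["x", "X", "xx", "Xx", "xXx"]:
    assert mul("xX", w) == normalize(w)
    assert mul("Xx", w) == normalize(w)

# ...but they are unequal as normal forms (neither can cancel itself,
# since the pair sits on the right edge)
assert normalize("xX") != normalize("Xx")
```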
 
Reformulation of the same idea:

Let G be a group. Let X = |G| × |G|.

Define a product on X by
(a,b).(c,d) = (abc,d)​

This is associative:
((a,b).(c,d)).(e,f) = (abc,d).(e,f) = (abcde,f) = (a,b).(cde,f) = (a,b).((c,d).(e,f))​
the elements (x',x) are left identities:
(a,a').(b,c) = (aa'b,c) = (b,c)​
and (b',a') is the left inverse of (a,b):
(b',a').(a,b) = (b'a'a,b) = (b',b)​
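If it helps to see this numerically, here's a small script of my own that checks associativity, the left identities, and the left inverses, taking G = Z_3 written additively (so "abc" means a+b+c mod 3):

```python
from itertools import product

n = 3                                    # G = Z_3 with addition mod 3
X = list(product(range(n), range(n)))    # X = |G| x |G|

def mul(p, q):
    # (a,b).(c,d) = (abc, d)
    (a, b), (c, d) = p, q
    return ((a + b + c) % n, d)

# (1) the product is associative
assert all(mul(mul(p, q), r) == mul(p, mul(q, r))
           for p in X for q in X for r in X)

# (2) the elements (x', x) are left identities, and nothing else is
left_ids = {e for e in X if all(mul(e, p) == p for p in X)}
assert left_ids == {((-x) % n, x) for x in range(n)}
assert len(left_ids) == 3   # three distinct left identities

# (3) (b', a') is a left inverse of (a, b)
for (a, b) in X:
    assert mul(((-b) % n, (-a) % n), (a, b)) in left_ids
```

Note that the three left identities are pairwise distinct, so there is no hope of a unique two-sided identity here.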
 
Thank you very much. That answers it, and saves me a significant amount of time. There's no right identity in X, since ##(a,b)=(a,b).(c,d)=(abc,d)## implies ##d=b##, which can't hold for every b at once. So my definition must at least be supplemented by the requirement that ##x^{-1}x## is the same left identity e for all x. And it's not hard to see that this is sufficient. ##x^{-1}x=e## implies ##(xx^{-1})^2=x(x^{-1}x)x^{-1}=xx^{-1}##, and multiplying ##(xx^{-1})^2=xx^{-1}## on the left by a left inverse of ##xx^{-1}## gives ##xx^{-1}=e##. So for arbitrary x, we have ##x^{-1}x=e=xx^{-1}##, and this means that for arbitrary x, we have ##xe=x(x^{-1}x)=(xx^{-1})x=ex=x##. So the left identity is also a right identity.
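For completeness, the no-right-identity claim is easy to confirm numerically; a quick check of my own, again taking G = Z_3 with addition mod 3 as the concrete group:

```python
from itertools import product

n = 3                                    # G = Z_3 with addition mod 3
X = list(product(range(n), range(n)))    # X = |G| x |G|

def mul(p, q):
    # (a,b).(c,d) = (abc, d)
    (a, b), (c, d) = p, q
    return ((a + b + c) % n, d)

# a right identity (c,d) would need (a,b).(c,d) = (a+b+c, d) = (a,b)
# for every (a,b), forcing d = b for all b at once -- impossible
right_ids = [f for f in X if all(mul(p, f) == p for p in X)]
assert right_ids == []
```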

By the way, I assume that the notation |G| means "the underlying set of G". Is that notation standard?
 