Is a commutative A-algebra algebraic over A associative?

In summary, the conversation discusses whether a commutative algebra ##B## over a ring ##A##, generated by elements ##\beta## that are algebraic over ##A##, is necessarily associative. The discussion involves using the Cauchy product to prove associativity for ##A[\beta]## and attempting to extend this by induction to finitely many generators, and it ends with the conclusion that ##A[\beta]## is an associative sub-algebra of ##B## over ##A##, even for a transcendental ##\beta##.
  • #1
coquelicot
Let ##A## be a ring and ##B## be a commutative algebra over ##A##.
Suppose that ##B## is generated by algebraic elements ##\beta\in B## over ##A##, meaning that ##\beta## fulfils a relation of the form ##P(\beta)=0##, with ##P\in A[X]##.
Is ##B## necessarily associative?

NOTE: As usual, ##\beta^i## is defined inductively by ##\beta^i = \beta^{i-1}\beta##.
By commutativity, it follows that ##\beta\beta^i = \beta^i \beta = \beta^{i+1}##, hence by an evident induction, ##\beta^i\beta^j = \beta^{i+j}##.

I think that the answer to the question is YES because of the following reason:

We can suppose without loss of generality that ##B## is generated by a finite number of algebraic elements over ##A##.

Suppose first ##B=A[\beta]## (that is, ##B## is generated by 1 element over ##A##).
We have ##(\beta^i\beta^j)\beta^k = \beta^{i+j+k}=\beta^i(\beta^j\beta^k)##, hence, using the Cauchy product for polynomials in ##\beta## over ##A##, it follows that ##B## is associative.
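Spelled out (a sketch, granting the monomial relation ##\beta^i\beta^j=\beta^{i+j}## from the NOTE and the ##A##-bilinearity of the product), the Cauchy-product step is
$$\Bigl(\sum_i a_i\beta^i\cdot\sum_j b_j\beta^j\Bigr)\cdot\sum_k c_k\beta^k \;=\; \sum_{i,j,k} a_ib_jc_k\,(\beta^i\beta^j)\beta^k \;=\; \sum_{i,j,k} a_ib_jc_k\,\beta^{i+j+k} \;=\; \sum_i a_i\beta^i\cdot\Bigl(\sum_j b_j\beta^j\cdot\sum_k c_k\beta^k\Bigr),$$
since both bracketings expand by distributivity to the same sum of monomials.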
If now ##B## has a finite number of algebraic generators, the result is true by an evident induction.
 
  • #2
fresh_42
coquelicot said:
Let ##A## be a ring and ##B## be a commutative algebra over ##A##.
Suppose that ##B## is generated by algebraic elements ##\beta\in B## over ##A##, meaning that ##\beta## fulfils a relation of the form ##P(\beta)=0##, with ##P\in A[X]##.
Is ##B## necessarily associative?

NOTE: As usual, ##\beta^i## is defined inductively by ##\beta^i = \beta^{i-1}\beta##.
By commutativity, it follows that ##\beta\beta^i = \beta^i \beta = \beta^{i+1}##, hence by an evident induction, ##\beta^i\beta^j = \beta^{i+j}##.

I think that the answer to the question is YES because of the following reason:

We can suppose without loss of generality that ##B## is generated by a finite number of algebraic elements over ##A##.
I don't see this. If the rest is true, you still have only a countable degree.
Suppose first ##B=A[\beta]## (that is, ##B## is generated by 1 element over ##A##).
We have ##(\beta^i\beta^j)\beta^k = \beta^{i+j+k}=\beta^i(\beta^j\beta^k)##, hence using the Cauchy product, for polynomials in ##\beta## over ##A##, it follows that ##B## is associative.
If now ##B## has a finite number of algebraic generators, the result is true by an evident induction.
We have ##\alpha \cdot (\beta_1 \cdot \beta_2) = (\alpha \cdot \beta_1)\cdot \beta_2## and ##\alpha_1 \cdot (\beta \cdot \alpha_2) = (\alpha_1 \cdot \beta)\cdot \alpha_2## by definition (for ##\alpha,\alpha_1,\alpha_2\in A##). However, these conditions don't automatically carry over from ##A[\beta_1]## over ##A## to ##A[\beta_1,\beta_2]=A[\beta_1][\beta_2]## over ##A[\beta_1]##, or in the next step to three generators. At least I don't see it immediately. I guess this is where commutativity will be needed.
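Written out, the missing piece seems to be (a sketch, in the thread's notation): for ##\alpha\in A## the identity ##\alpha\cdot(xy)=(\alpha\cdot x)y## is an axiom, but the induction step would need something like
$$p(\beta_1)\cdot\bigl(q(\beta_1)\,\beta_2\bigr) \;=\; \bigl(p(\beta_1)\,q(\beta_1)\bigr)\cdot\beta_2 \qquad (p,q\in A[X]),$$
i.e. that ##A[\beta_1,\beta_2]## is an ##A[\beta_1]##-algebra, and that is not an instance of the axiom for scalars from ##A##.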

I think I could construct a counterexample for non-commutative ##B##, so let me think whether this is a crucial condition or not. If the induction is correct, which I'm not sure of, then your w.l.o.g. probably does the job: ##A[\beta_\iota]_{\iota \in I}=A[\beta_\iota]_{\iota \in I\setminus \{\iota_0\}}[\beta_{\iota_0}]##, although you possibly need AC or even a transfinite induction, because the step from countable to uncountable simply has to be made somewhere.
 
  • #3
coquelicot
fresh_42 said:
I don't see this. If the rest is true, you still have only a countable degree.
Yes, but any element ##x## of ##B## is a polynomial over ##A## in finitely many generators, so, in order to prove that ##x_1(x_2x_3) = (x_1x_2)x_3##, it suffices to work in the sub-algebra generated by the finitely many generators involved in the expressions of ##x_1,x_2,x_3##.
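To make the reduction explicit (a sketch): if ##x_1,x_2,x_3## involve only the generators ##\beta_{i_1},\dots,\beta_{i_n}##, then
$$x_1,x_2,x_3 \in A[\beta_{i_1},\dots,\beta_{i_n}] \quad\Longrightarrow\quad x_1(x_2x_3),\ (x_1x_2)x_3 \in A[\beta_{i_1},\dots,\beta_{i_n}],$$
so associativity of ##B## follows once every sub-algebra generated by finitely many of the ##\beta##'s is associative.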

fresh_42 said:
However, these conditions don't automatically carry over from ##A[\beta_1]## over ##A## to ##A[\beta_1,\beta_2]=A[\beta_1][\beta_2]## over ##A[\beta_1]##, or in the next step to three generators. At least I don't see it immediately. I guess this is where commutativity will be needed.

Well, ##A## is a commutative ring, hence also a commutative pseudo-ring (that is, a ring not required to have a unity). Also, ##A[\beta_1]## has been shown to be a commutative pseudo-ring (since the associativity of the product has, hopefully, been shown, and the commutativity is clear). So the same argument can be repeated with ##A[\beta_1]## in place of ##A## and ##A[\beta_1, \beta_2]## in place of ##A[\beta_1]##, etc. I mean, the induction step could be: if ##A## is a commutative pseudo-ring and ##C## is a commutative algebra over ##A## generated by a single algebraic element over ##A##, then ##C## is a commutative pseudo-ring (with respect to the algebra product).
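Stated as a formal induction step (a sketch, introducing the shorthand ##R_m = A[\beta_1,\dots,\beta_m]##, which is not notation from the thread): one would like to show
$$R_m \text{ is a commutative pseudo-ring} \;\Longrightarrow\; R_{m+1}=R_m[\beta_{m+1}] \text{ is a commutative pseudo-ring},$$
granted that ##R_{m+1}## really is a commutative ##R_m##-algebra generated by the single element ##\beta_{m+1}##, which is the point questioned above.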
 
  • #4
fresh_42
coquelicot said:
I don't understand why you say "by definition". Isn't it what we are trying to prove?
The associativity with the scalars is AFAIK part of the definition of an ##A##-algebra. Thus it is open whether ##A[\beta,\gamma]=A[\beta][\gamma]## is an ##A[\beta]##-algebra in this sense; that is not automatically covered by induction. Or more directly: if induction works, how should we write the single induction step so that the algebra axiom "associativity with scalars" holds? I don't see ##\alpha (\beta \gamma) = (\alpha \beta ) \gamma##, because we have the ##\alpha## part by definition for a single element ##\beta##, but not how it decouples the product of two algebraic elements, i.e. scaling ##\beta \gamma## might be different from the product with one factor scaled first. Not quite sure, but here's where I would start to construct a counterexample. Although the commutativity requirement is pretty strong, it cannot replace associativity.

Your argument about the countability is that in each single case we only have finitely many terms in the sums. Sounds OK.
 
  • #5
coquelicot
fresh_42 said:
The associativity with the scalars is AFAIK part of the definition of an ##A##-algebra. Thus it is open whether ##A[\beta,\gamma]=A[\beta][\gamma]## is an ##A[\beta]##-algebra in this sense; that is not automatically covered by induction. Or more directly: if induction works, how should we write the single induction step so that the algebra axiom "associativity with scalars" holds? I don't see ##\alpha (\beta \gamma) = (\alpha \beta ) \gamma##, because we have the ##\alpha## part by definition for a single element ##\beta##, but not how it decouples the product of two algebraic elements, i.e. scaling ##\beta \gamma## might be different from the product with one factor scaled first. Not quite sure, but here's where I would start to construct a counterexample. Although the commutativity requirement is pretty strong, it cannot replace associativity.

Your argument about the countability is that in each single case we only have finitely many terms in the sums. Sounds OK.

You are right. Furthermore, I've realized that the assumption "##\beta## is algebraic over ##A##" has nowhere been used. Do you think that the argument above proves at least that ##A[\beta]## is an associative sub-algebra of ##B## over ##A##?
 
  • #6
fresh_42
coquelicot said:
You are right. Furthermore, I've realized that the assumption "##\beta## is algebraic over ##A##" has nowhere been used. Do you think that the argument above proves at least that ##A[\beta]## is an associative sub-algebra of ##B## over ##A##?
Sure, since we have associativity with the elements of ##A## by definition, and all other elements are polynomials in ##\beta##, so it extends to them via distributivity. But this should also hold for a transcendental ##\beta##, as the individual elements are all finite sums.
 
  • #7
coquelicot
fresh_42 said:
Sure, since we have associativity with the elements of ##A## by definition, and all other elements are polynomials in ##\beta##, so it extends to them via distributivity. But this should also hold for a transcendental ##\beta##, as the individual elements are all finite sums.
Thank you so much, fresh_42.
 

1. What is a commutative A-algebra?

A commutative ##A##-algebra is a mathematical structure consisting of a set of elements together with addition, multiplication, and a scalar multiplication by elements of a base ring ##A##, such that the multiplication is commutative and ##A##-bilinear. In general the multiplication need not be associative; whether it must be in the situation above is exactly the question of this thread.

2. What does it mean for an algebra to be algebraic over A?

An algebra ##B## is said to be algebraic over ##A## if every element of ##B## is a root of a polynomial with coefficients in ##A##. In other words, every element of ##B## satisfies a polynomial equation with coefficients in ##A##.
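For instance (a standard example, not taken from the thread above): ##\sqrt{2}## is algebraic over ##\mathbb{Q}##, since it satisfies ##X^2-2=0##, whereas ##\pi## is transcendental, i.e. not algebraic, over ##\mathbb{Q}##.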

3. What is the difference between commutative and associative properties in an algebra?

The commutative property states that the order of elements in a binary operation does not affect the result, while the associative property states that the grouping of elements in a binary operation does not affect the result. In other words, for a commutative operation, ##a * b = b * a##, and for an associative operation, ##(a * b) * c = a * (b * c)##.

4. How does an algebra being commutative and associative affect its properties?

The commutative and associative properties allow for simplification and manipulation of algebraic expressions, making it easier to solve equations and prove theorems. These properties are also important in the study of abstract algebra and its applications in various fields of mathematics.

5. Can an algebra be commutative but not associative, or vice versa?

Yes, it is possible for an operation to be commutative but not associative, or associative but not commutative. For example, the averaging operation ##a * b = (a+b)/2## on the real numbers is commutative but not associative, while the set of ##2\times 2## matrices under matrix multiplication is associative but not commutative.
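A quick numerical sanity check of these two examples (a sketch, assuming Python with NumPy is available; not part of the original thread):

import numpy as np

# Commutative but not associative: the averaging operation on the reals.
def avg(a, b):
    return (a + b) / 2

a, b, c = 1.0, 2.0, 4.0
print(avg(a, b) == avg(b, a))                 # True: commutative
print(avg(avg(a, b), c), avg(a, avg(b, c)))   # 2.75 vs 2.0: not associative

# Associative but not commutative: 2x2 matrices under matrix multiplication.
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0, 0.0], [1.0, 0.0]])
C = np.array([[1.0, 2.0], [3.0, 4.0]])
print(np.allclose((A @ B) @ C, A @ (B @ C)))  # True: associative
print(np.allclose(A @ B, B @ A))              # False: not commutative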
