# I Centre of an Algebra ... and Central Algebras ...

1. Nov 27, 2016

### Math Amateur

I am reading Matej Bresar's book, "Introduction to Noncommutative Algebra" and am currently focussed on Chapter 1: Finite Dimensional Division Algebras ... ...

I need help with some remarks of Bresar on the centre of an algebra ...

Commencing a section on Central Algebras, Bresar writes the following:

In the above text we read the following:

" ... The center of a unital algebra obviously contains scalar multiples of unity ... ... "

Now the center of a unital algebra $A$ is defined as the set $Z(A)$ such that

$Z(A) = \{ c \in A \mid cx = xc \text{ for all } x \in A \}$

Now ... clearly $1 \in Z(A)$ since $1x = x1$ for all $x \in A$ ...

BUT ... why do elements like $3$ belong to $Z(A)$ ... ?

That is ... how would we demonstrate that $3x = x3$ for all $x \in A$ ... ?

Hope someone can help ...

Peter

#### Attached Files:

• ###### Bresar - 1 - Central Algebras - PART 1 ... ....png
2. Nov 27, 2016

### Staff: Mentor

For every vector space $V$ over a field $\mathbb{F}$ we have $c\cdot v = v \cdot c$ for all $c \in \mathbb{F}$ and all $v \in V$.
And algebras are vector spaces. It is the same argument you used with the real numbers and the division algebra $D$ over $\mathbb{R}$.
This means that for an algebra $\mathcal{A}$ we even have $c\cdot (v\cdot w)=(c \cdot v) \cdot w = v \cdot (c \cdot w) = (v\cdot w) \cdot c$ for all $c\in \mathbb{F}$ and $v,w \in \mathcal{A}$ (by definition). Unlike modules, which can be left modules or right modules, a vector space is always both a left and a right module. One can weaken the requirement that $\mathbb{F}$ be a field and allow it to be merely a ring, but then we speak of modules rather than vector spaces. However, this doesn't change the rule for scalar multiplication: if algebras are considered over a ring, it is still required. But in general the scalars are taken from a field unless explicitly stated otherwise.
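To make this concrete, here is a small numerical sketch of my own (not from Bresar's text), using the noncommutative algebra of $2 \times 2$ real matrices: the scalar slides freely past products even though the matrices themselves do not commute.

```python
import numpy as np

# The algebra A = M_2(R) of 2x2 real matrices is noncommutative,
# but the scalar action of c in R commutes past products:
#   c*(v @ w) == (c*v) @ w == v @ (c*w) == (v @ w)*c
c = 3.0
v = np.array([[1.0, 2.0], [0.0, 1.0]])
w = np.array([[0.0, 1.0], [1.0, 0.0]])

assert not np.allclose(v @ w, w @ v)          # v and w themselves do not commute
assert np.allclose(c * (v @ w), (c * v) @ w)  # scalar pulls out of the first factor
assert np.allclose(c * (v @ w), v @ (c * w))  # ... and out of the second factor
assert np.allclose(c * (v @ w), (v @ w) * c)  # writing the scalar on the right is the same
```

The point is that the scalar action is defined entrywise, so it commutes with everything; the same checks pass for any choice of v and w.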

3. Nov 27, 2016

### Math Amateur

Thanks for the help, fresh_42 ...

But ... I am not completely following you ... sorry to be slow ...

You write:

" ... ... For every vector space $V$ over a field $\mathbb{F}$ we have $c\cdot v = v \cdot c$ for all $c \in \mathbb{F}$ and all $v \in V$. ... ... "

But why exactly is this true ...? It does not seem to be one of the axioms ... see below ...

The axioms for a vector space as given in Cooperstein: Advanced Linear Algebra (Second Edition) are given below ...

Again ... sorry if I'm missing something obvious ..

Peter

#### Attached Files:

• ###### Cooperstein - Axioms for a vector space.png
4. Nov 28, 2016

### haruspex

I think you may be confusing the unit element of the field with the unit element of the algebra.
In a unital algebra (V, F), let I be the unit of V. A scalar multiple of that unit is λI, where λ∈F. Commutativity of λI with x∈V reads (λI)x=x(λI). We don't worry about λ commuting with elements of V, because the scalar product is always written the same way around; i.e. xλ does not need to be defined.

5. Nov 28, 2016

### Staff: Mentor

No need for a sorry here. It is in fact a good question. As I've always thought of vectors as little arrows, where scalar multiples are only a stretch or compression of them, I never really thought about the difference between a left-stretch and a right-stretch. Unlike groups, modules, rings and algebras, where the distinction between left and right comes along with the definition, this is not the case here.

This is what I have found:

van der Waerden speaks of left and right vector spaces and distinguishes two associative laws $(M3)\; (ab)u=a(bu)$ and $(M3^*)\; u(ab)=(ua)b$.
Unfortunately he doesn't explain whether this is a convention to identify the two isomorphic vector spaces $V_\mathbb{F}$ and ${}_\mathbb{F}V$ (in which case an additional axiom would be needed), or whether it is forced by the commutativity of $\mathbb{F}$.

Another source (on didactics) mentions that the distinction was first made by Bourbaki in 1947, but I have not tracked this down to a proper source I could quote.

The definitions I have found are all the same as yours above. And like me, nobody really wasted a thought on left versus right, except in my quote of van der Waerden above. I suppose that since both would lead to essentially the same vector space (up to isomorphism), they didn't regard it as necessary. I've tried to prove $au=ua$ for (commutative) fields $\mathbb{F}$ but haven't found a solution quickly. It is clear that one has to be careful with non-commutative (skew) fields, for associativity ($(M3)$, resp. $(M3^*)$) would get us into trouble: we could derive $u(ab)=(ab)u=a(bu)=a(ub)=(ub)a=u(ba) \neq u(ab) ,$ which is no danger for commutative fields. My suggestion is: as long as we don't have a better idea (or a proof, or someone who really knows), we should take it as an additional (even though unspoken) axiom, for otherwise all of linear algebra would be unnecessarily (and probably quite distractingly) overloaded with lefts and rights.
I know this isn't a satisfactory view of the issue, and I would definitely prefer a proof, but I think the idea that stretching a vector by the same factor from the left and from the right could give different results is even more troublesome.
______________________
(1) https://www.amazon.com/Algebra-I-B-...36240&sr=1-1&keywords=van+der+waerden+algebra

Last edited: Nov 28, 2016
6. Nov 28, 2016

### haruspex

Not sure whether you saw/understood my post. In the standard axioms of an algebra, right-multiplication by scalars, i.e. xλ, is not even defined.

Let the algebra be (V, F), I be a unit in V, and λ, μ∈F.
For any x∈V, I.x=x.I.
The compatibility axiom says that if y∈V then (λμ)(x.y)=(λx).(μy).
Thus (λI).x=λ(I.x)=λ(x.I)=x.(λI).
Thus λI commutes with all elements of V.
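As a quick numerical sanity check (my own sketch, not part of the argument above), one can verify in the algebra of 2×2 real matrices that λI commutes with arbitrary elements, even though that algebra is noncommutative:

```python
import numpy as np

# In A = M_2(R), the unit is the identity matrix I.
# The scalar multiple lam*I commutes with every matrix x,
# even though M_2(R) itself is noncommutative.
lam = 3.0
I = np.eye(2)
rng = np.random.default_rng(0)
for _ in range(5):
    x = rng.standard_normal((2, 2))
    assert np.allclose((lam * I) @ x, x @ (lam * I))
```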

No doubt you could define right-multiplication by scalars, and there would be no requirement for it to equate to λx.

7. Nov 28, 2016

### Staff: Mentor

Nevertheless, I found it interesting to ask why we use $\lambda \cdot v = v \cdot \lambda$ in vector spaces without mentioning this convention. I looked it up in two different books and it wasn't in there. But I didn't check whether it is needed in the rest of those books. van der Waerden's and Bourbaki's remarks on the issue at least showed that it is not at all self-evident.
And I've been a little bit in the mood of an earlier thread, in which Bresar used the free positioning of reals in a proof about division algebras over $\mathbb{R}$ (if I remember correctly; I'm not quite sure whether it was really needed).

8. Nov 29, 2016

### Math Amateur

Thanks for the help haruspex ... ...

Just a clarification ... you write:

" ... ... Let the algebra be $(V, F)$, $I$ be a unit in $V$, and $λ, μ∈F$.... "

I am assuming that you mean $I$ is the unit (or unity or multiplicative identity) in $V$ ... ... and not simply a unit in $V$ ... is that correct?

You also write:

" ... ... right-multiplication by scalars, i.e. $x\lambda$, is not even defined. ... "

and yet we do not speak of left and right vector spaces over fields ... ... so surely $x \lambda = \lambda x$ in some sense ...???

I looked up some relevant texts for insights on this question ... it may be that Bland's description (Paul E. Bland: "Rings and Their Modules") of how to turn a right module, where the action is given by a binary operation $M \times R \rightarrow M$, $(x,a) \mapsto xa$, into a left module by setting $a \cdot x = xa$ ... is relevant ...

The relevant text from Paul E. Bland: "Rings and Their Modules" is as follows:

So my conclusion is that Bland is saying that when the ring is commutative there is essentially no difference between a right and a left module ...

I must say that I would have preferred an axiom that somehow directly implied that $ax = xa$ ...

What do you think ...?

Peter

#### Attached Files:

• ###### Bland - 2 - Section 1.4 - Scalars acting on module or vector space elements - PART 2 ... .png
9. Nov 30, 2016

### Math Amateur

Hi haruspex ... just another few clarifications I hope you can help with ...

You write:

" ... ... The compatibility axiom says that if $y∈V$ then $(λμ)(x.y)=(λx).(μy)$. ... ... "

In the texts I have checked lately the compatibility axiom reads something like:

$\lambda(xy) = (\lambda x)y = x(\lambda y)$ for all $\lambda\in F$ and $x,y\in V.$

In other words there is only one scalar mentioned in the axiom ... ... why have you included two scalars ...

You also write:

" ... ...
Thus $(λI).x=λ(I.x)=λ(x.I)=x.(λI)$.
Thus $λI$ commutes with all elements of $V$."

Your derivation of the fact that $λI$ commutes with all elements of $V$ involves assuming that $I.x = x.I$ ... but why exactly is this true ...

Indeed $I.x = x.I$ is not an axiom ... see below ... and I cannot see how to derive this from the axioms of a vector space ...

Can you help

Peter

======================================================================================================================

The axioms for a vector space V over a field F are given in Bruce N Cooperstein's book "Advanced Linear Algebra" (Second Edition) as follows:

#### Attached Files:

• ###### Cooperstein - Axioms for a vector space.png
10. Nov 30, 2016

### haruspex

Yes.
Not really. If you look through the axioms for an algebra, the product of a scalar with a vector is always written the same way around. We are accustomed to algebras in which defining the other product to be the same creates no difficulty, so writing it either way around is harmless. But I bet that we never need to write it the other way.
No, he wrote that only if it is commutative can you elect to define the product such that it commutes. This leaves open the possibility of defining left and right algebras (or modules) to be different even though the scalars commute.

11. Nov 30, 2016

### haruspex

There are probably several equivalent ways of writing that axiom. I picked it off a random website. In fact, I don't think I used the other scalar when I applied it.
The topic here is unital algebras, i.e. algebras in which the vector product operation has a unit, I. By definition, I.x = x = x.I.
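For what it's worth, here is a tiny numerical illustration (my own sketch) of that unit axiom in the matrix algebra M_2(ℝ), where the unit is the identity matrix:

```python
import numpy as np

# In the algebra M_2(R), the identity matrix I is the unit:
# I @ x == x == x @ I for every matrix x.
I = np.eye(2)
x = np.array([[2.0, -1.0], [5.0, 0.5]])
assert np.allclose(I @ x, x)
assert np.allclose(x @ I, x)
```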

12. Nov 30, 2016

### Math Amateur

Thanks haruspex ... appreciate your help ...

Peter

13. Nov 30, 2016

### haruspex

You are welcome, and thanks for bringing up the subject. I had never heard of unital algebras. My pure maths education did not go that far.

14. Nov 30, 2016

### Math Amateur

fresh_42 ... thanks so much for all your help on this issue ...

Peter

15. Dec 1, 2016

### Staff: Mentor

Hi Peter,
I have another point on the issue. Consider vectors written in some basis, so that we have coordinates. Then we write a vector $v=(v_1,v_2, \ldots)$ and $\lambda v = (\lambda v_1,\lambda v_2, \ldots)$. The $v_i$ are all elements of the field, and therefore $\lambda v_i = v_i \lambda$ holds, which turns into $\lambda v = v \lambda$ for the entire vector. Hence any different definition of left and right multiplication by scalars would get very problematic.
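A small sketch of this coordinatewise argument (my own illustration in numpy, where both λ·v and v·λ happen to be defined and agree):

```python
import numpy as np

# Coordinatewise, lam * v multiplies each coordinate v_i by lam.
# Since lam and the v_i are elements of the same field,
# lam * v_i == v_i * lam, so the whole vectors agree too.
lam = 2.5
v = np.array([1.0, 3.0, 0.0, -1.0])
assert np.allclose(lam * v, np.array([lam * vi for vi in v]))  # entrywise from the left
assert np.allclose(lam * v, np.array([vi * lam for vi in v]))  # entrywise from the right
assert np.allclose(lam * v, v * lam)  # numpy defines both orders and they coincide
```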

16. Dec 2, 2016

### Math Amateur

Hi fresh_42 ... well! ... most interesting ...

It explains why we don't talk about left and right vector spaces ... based on your analysis, they are both the same ...

Just a point that worried me ...

You write:

" ... ... The $v_i$ are all elements of the field, ... ... "

Rather than being elements of the field, $F$, the $v_i$ seem to me to be elements of the vector space ...

But, anyway, I agree with everything else you wrote ... and it is most illuminating ... thank you ...

Maybe your analysis should be in textbook presentation of vector spaces ... especially those books that are at senior undergraduate and beginning graduate levels ...

Peter

17. Dec 2, 2016

### Staff: Mentor

No, I meant the $v_i$ being the coordinates.

Let's consider for simplicity a single vector $\vec{b_1}$. This is a basis vector for a one-dimensional vector space $V = \mathbb{F}\cdot \vec{b_1}$. Then an arbitrary vector can be written as $\vec{w_{}}=v_1\cdot \vec{b_1}$ with $v_1 \in \mathbb{F}$ being the coordinate according to the basis $\{\vec{b_1}\}$. We now usually write $\vec{w_{}}=(v_1)$ for short.

Now let us assume for a moment that $\lambda \vec{w_{}} \neq \vec{w_{}} \lambda$. Then in our (usual) coordinate notation we would have $\lambda \vec{w_{}} = (\lambda v_1) \neq (v_1 \lambda) = \vec{w_{}} \lambda$, which is strange, because the numbers $\lambda$ and $v_1$ do commute as elements of $\mathbb{F}$.

This is not a proof that $\lambda \vec{w_{}} = \vec{w_{}} \lambda$, but a reason why it makes sense; otherwise we would have to say goodbye to our convenient notation, e.g. $\vec{u_{}} =(1,3,0,-1) \in \mathbb{R}^4$.

18. Dec 2, 2016

### Math Amateur

Thanks for the explanation fresh_42 ...

That really clarified things ...

Thanks again ...

Peter