
I Centre of an Algebra ... and Central Algebras ...

  1. Nov 27, 2016 #1
    I am reading Matej Bresar's book, "Introduction to Noncommutative Algebra" and am currently focussed on Chapter 1: Finite Dimensional Division Algebras ... ...

    I need help with some remarks of Bresar on the centre of an algebra ...

    Commencing a section on Central Algebras, Bresar writes the following:


    [Image: scanned excerpt from Bresar's text introducing central algebras]




    In the above text we read the following:

    " ... The center of a unital algebra obviously contains scalar multiples of unity ... ... "


    Now the center of a unital algebra ##A## is defined as the set ##Z(A)## such that

    ##Z(A) = \{ c \in A \mid cx = xc \text{ for all } x \in A \}##


    Now ... clearly ##1 \in Z(A)## since ##1x = x1## for all ##x \in A## ...

    BUT ... why do elements like ##3## (that is, ##3 \cdot 1##) belong to ##Z(A)## ... ?

    That is ... how would we demonstrate that ##3x = x3## for all ##x \in A## ... ?

    Hope someone can help ...

    Peter
     

  3. Nov 27, 2016 #2

    fresh_42


    For every vector space ##V## over a field ##\mathbb{F}## holds: ##c\cdot v = v \cdot c## for all ##c \in \mathbb{F}## and all ##v \in V##.
    And algebras are vector spaces. It is the same argument you used with the real numbers and the division algebra ##D## over ##\mathbb{R}##.
    This means that for an algebra ##\mathcal{A}## we even have ##c\cdot (v\cdot w)=(v \cdot c) \cdot w = v \cdot (c \cdot w) = (v\cdot w) \cdot c## for all ##c\in \mathbb{F}## and ##v,w \in \mathcal{A}## (by definition). Unlike modules, which can be left or right modules, a vector space is always both a left and a right module. One can weaken the requirement from ##\mathbb{F}## being a field to being a ring, but then we speak of modules rather than vector spaces. This doesn't change the rule for scalar multiplication, however: if algebras are considered over a ring, it is still required. But in general the scalars are taken from a field unless explicitly stated otherwise.
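    As a quick sanity check of this chain (a hypothetical numerical sketch with numpy, not a proof, taking the 2×2 real matrices as an algebra over ##\mathbb{R}##):

    ```python
    import numpy as np

    # Sketch: the 2x2 real matrices form a unital algebra over R,
    # with matrix multiplication as the algebra product.
    c = 3.0                                   # a scalar from the field R
    v = np.array([[1.0, 2.0], [0.0, 1.0]])    # two arbitrary algebra elements
    w = np.array([[0.0, 1.0], [1.0, 3.0]])

    # the scalar slides through either factor of a product:
    lhs = c * (v @ w)       # c.(v.w)
    m1  = (c * v) @ w       # (c.v).w
    m2  = v @ (c * w)       # v.(c.w)
    rhs = (v @ w) * c       # (v.w).c

    assert np.allclose(lhs, m1) and np.allclose(m1, m2) and np.allclose(m2, rhs)
    ```

    Of course this only illustrates the bilinearity axiom in one particular algebra; it proves nothing in general.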
     
  4. Nov 27, 2016 #3


    Thanks for the help, fresh_42 ...

    But ... I am not completely following you ... sorry to be slow ...

    You write:

    " ... ... For every vector space ##V## over a field ##\mathbb{F}## holds: ##c\cdot v = v \cdot c## for all ##c \in \mathbb{F}## and all ##v \in V##. ... ... "

    But why exactly is this true ...? It does not seem to be one of the axioms ... see below ...

    The axioms for a vector space as given in Cooperstein: Advanced Linear Algebra (Second Edition) are given below ...



    [Image: scanned vector space axioms from Cooperstein, Advanced Linear Algebra (Second Edition)]


    Could you please help further ...

    Again ... sorry if I'm missing something obvious ..

    Peter
     


  5. Nov 28, 2016 #4

    haruspex


    I think you may be confusing the unit element of the field with the unit element of the algebra.
    In a unital algebra ##(V, F)##, let ##I## be the unit of ##V##. A scalar multiple of that is ##\lambda I##, where ##\lambda \in F##. Commutativity of that with ##x \in V## would look like ##(\lambda I)x = x(\lambda I)##. We don't worry about ##\lambda## commuting with elements of ##V## because the scalar product is always written the same way around, i.e. ##x\lambda## does not need to be defined.
     
  6. Nov 28, 2016 #5

    fresh_42


    No need for a sorry here. It is in fact a good question. As I've always thought of vectors as little arrows, where scalar multiples are only a stretch or compression of them, I never really thought about a difference between a left-stretch and a right-stretch. Unlike for groups, modules, rings and algebras, where the distinction between left and right comes along with the definition, this is not the case here.

    This is what I have found:

    van der Waerden speaks of left and right vector spaces and distinguishes two associative laws ##(M3)\; (ab)u=a(bu)## and ##(M3^*)\; u(ab)=(ua)b##.
    Unfortunately he doesn't explain whether the identification of the two isomorphic vector spaces ##V_\mathbb{F}## and ##{}_\mathbb{F}V## is a convention, i.e. whether an additional axiom would be needed, or whether it is forced by the commutativity of ##\mathbb{F}.##

    Another source (on didactics) mentions that the distinction was first made by Bourbaki in 1947, but I did not track this down to a proper source I could quote.

    The definitions I have found are all the same as yours above. And like me, nobody really seems to waste a thought on left versus right, except van der Waerden in the quote above. I suppose that since both choices lead to essentially the same vector space (up to isomorphism), the authors didn't regard the distinction as necessary.

    I've tried to prove ##au=ua## for (commutative) fields ##\mathbb{F}## but haven't found a solution quickly. It is clear that one has to be careful with non-commutative (skew) fields, for associativity (##(M3)##, resp. ##(M3^*)##) would get us into trouble: we would obtain ##u(ab)=(ab)u=a(bu)=a(ub)=(ub)a=u(ba) \neq u(ab)##, a danger that does not arise for commutative fields.

    My suggestion is: as long as we don't have a better idea (or a proof, or someone who really knows), we should take it as an additional (even though unspoken) axiom, for otherwise all of linear algebra would be unnecessarily (and probably quite distractingly) overloaded with lefts and rights.
    I know this isn't a satisfactory view of the issue, and I would definitely prefer a proof, but I think the idea that stretching a vector by the same factor from the left and from the right could give different results is even more troublesome.
    ______________________
    (1) https://www.amazon.com/Algebra-I-B-...36240&sr=1-1&keywords=van+der+waerden+algebra
     
    Last edited: Nov 28, 2016
  7. Nov 28, 2016 #6

    haruspex


    Not sure whether you saw/understood my post. In the standard axioms of an algebra, right-multiplication by scalars, i.e. ##x\lambda##, is not even defined.

    Let the algebra be ##(V, F)##, ##I## be a unit in ##V##, and ##\lambda, \mu \in F##.
    For any ##x \in V##, ##I.x = x.I##.
    The compatibility axiom says that if ##y \in V## then ##(\lambda\mu)(x.y)=(\lambda x).(\mu y)##.
    Thus ##(\lambda I).x=\lambda(I.x)=\lambda(x.I)=x.(\lambda I)##.
    Thus ##\lambda I## commutes with all elements of ##V##.

    No doubt you could define right-multiplication by scalars, and there would be no requirement for it to equate to ##\lambda x##.
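    To make this concrete, here is a quick numerical check (a hypothetical numpy sketch, taking the 2×2 real matrices as the algebra): ##\lambda I## commutes with every element even though generic elements do not:

    ```python
    import numpy as np

    I = np.eye(2)        # the unit of the 2x2 real matrix algebra
    lam = 3.0            # a scalar

    # lam*I commutes with every x, via (lam*I)@x = lam*(I@x) = lam*(x@I) = x@(lam*I)
    rng = np.random.default_rng(0)
    for _ in range(5):
        x = rng.standard_normal((2, 2))
        assert np.allclose((lam * I) @ x, x @ (lam * I))

    # ...while generic elements of the algebra do NOT commute:
    a = np.array([[0.0, 1.0], [0.0, 0.0]])
    b = np.array([[0.0, 0.0], [1.0, 0.0]])
    assert not np.allclose(a @ b, b @ a)
    ```

    So the scalar multiples of the unit really do sit in the center, while the algebra as a whole is noncommutative.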
     
  8. Nov 28, 2016 #7

    fresh_42


    Yes, you're right, that answers the question about the center.

    Nevertheless, I found it interesting to ask why we use ##\lambda \cdot v = v \cdot \lambda## in vector spaces without this convention ever being mentioned. I looked it up in two different books and it wasn't in there, though I didn't check whether it is needed in the rest of those books. van der Waerden's and Bourbaki's remarks on the issue at least showed that it is not at all self-evident.
    And I've been a little bit in the mood of an earlier thread, in which Bresar used the free positioning of reals in a proof about division algebras over ##\mathbb{R}## (if I remember correctly; I'm not quite sure it was really needed).
     
  9. Nov 29, 2016 #8



    Thanks for the help haruspex ... ...

    Just a clarification ... you write:

    " ... ... Let the algebra be ##(V, F)##, ##I## be a unit in ##V##, and ##λ, μ∈F##.... "

    I am assuming that you mean ##I## is the unit (or unity or multiplicative identity) in ##V## ... ... and not simply a unit in ##V## ... is that correct?


    You also write:

    " ... ... right-multiplication by scalars, i.e.##xλ##, is not even defined. ... "

    and yet we do not speak of left and right vector spaces over fields ... ... so surely ##x \lambda = \lambda x## in some sense ...???


    I looked up some relevant texts for insights on this question ... it may be that Bland's (Paul E. Bland: "Rings and Their Modules") description of how to turn a right module, where the action is described by a binary operation ##M \times R \rightarrow M## such that ##(x,a) \mapsto xa##, into a left module by setting ##a \cdot x = xa## ... is relevant ...


    The relevant text from Paul E. Bland: "Rings and Their Modules" is as follows:


    [Image: scanned excerpt from Bland, Rings and Their Modules, on left and right modules]

    So my conclusion is that Bland is saying that when the ring is commutative there is essentially no difference between a right and a left module ...

    I must say that I would have preferred an axiom that somehow directly implied that ##ax = xa## ...

    What do you think ...?


    Peter
     
  10. Nov 30, 2016 #9

    Hi haruspex ... just another few clarifications I hope you can help with ...

    You write:

    " ... ... The compatibility axiom says that if ##y∈V## then ##(λμ)(x.y)=(λx).(μy)##. ... ... "

    In the texts I have checked lately the compatibility axiom reads something like:

    ##\lambda(xy) = (\lambda x)y = x(\lambda y)## for all ##\lambda\in F## and ##x,y\in V.##

    In other words, there is only one scalar mentioned in the axiom ... why have you included two scalars ...?


    You also write:

    " ... ...
    Thus ##(λI).x=λ(I.x)=λ(x.I)=x.(λI)##.
    Thus ##λI## commutes with all elements of ##V##."


    Your derivation of the fact that ##λI## commutes with all elements of ##V## involves assuming that ##I.x = x.I## ... but why exactly is this true ...?

    Indeed ##I.x = x.I## is not an axiom ... see below ... and I cannot see how to derive it from the axioms of a vector space ...

    Can you help?

    Peter


    ======================================================================================================================


    The axioms for a vector space V over a field F are given in Bruce N Cooperstein's book "Advanced Linear Algebra" (Second Edition) as follows:


    [Image: scanned vector space axioms from Cooperstein, Advanced Linear Algebra (Second Edition)]
     


  11. Nov 30, 2016 #10

    haruspex


    [On whether ##I## is the unit of ##V##:] Yes.
    [On whether ##x\lambda = \lambda x## in some sense:] Not really. If you look through the axioms for an algebra, the product of a scalar with a vector is always written the same way around. We are accustomed to algebras in which defining the other product to be the same creates no difficulty, so writing it either way around is harmless. But I bet that we never need to write it the other way.
    [On Bland:] No, he wrote that only if the ring is commutative can you elect to define the product such that it commutes. This leaves open the possibility of defining left and right algebras (or modules) to be different even though the scalars commute.
     
  12. Nov 30, 2016 #11

    haruspex


    [On the two scalars:] There are probably several equivalent ways of writing that axiom. I picked it off a random website. In fact, I don't think I used the other scalar when I applied it.
    [On ##I.x = x.I##:] The topic here is unital algebras, i.e. the vector product operation has a unit ##I##. By definition, ##I.x = x = x.I##.
     
  13. Nov 30, 2016 #12
    Thanks haruspex ... appreciate your help ...

    Peter
     
  14. Nov 30, 2016 #13

    haruspex


    You are welcome, and thanks for bringing up the subject. I had never heard of unital algebras. My pure maths education did not go that far.
     
  15. Nov 30, 2016 #14

    fresh_42 ... Thanks so much for all your help on this issue ...

    Peter
     
  16. Dec 1, 2016 #15

    fresh_42


    Hi Peter,
    I have another point on the issue. If we consider vectors written in some basis, we have coordinates: we write a vector ##v=(v_1,v_2, \ldots)## and ##\lambda v = (\lambda v_1,\lambda v_2, \ldots)##. The ##v_i## are all elements of the field, and therefore ##\lambda v_i = v_i \lambda## holds, which turns into ##\lambda v = v \lambda## for the entire vector. Hence any different definition of left and right multiplication by scalars would be very problematic.
     
  17. Dec 2, 2016 #16

    Hi fresh_42 ... well! ... most interesting ...

    It explains why we don't talk about left and right vector spaces ... based on your analysis, they are both the same ...


    Just a point that worried me ...

    You write:

    " ... ... The ##v_i## are all elements of the field, ... ... "

    Rather than being elements of the field, ##F##, the ##v_i## seem to me to be elements of the vector space ...


    But, anyway, I agree with everything else you wrote ... and it is most illuminating ... thank you ...

    Maybe your analysis should be in textbook presentations of vector spaces ... especially those books that are at senior undergraduate and beginning graduate levels ...

    Peter
     
  18. Dec 2, 2016 #17

    fresh_42


    No, I meant the ##v_i## being the coordinates.

    Let's consider for simplicity a single vector ##\vec{b_1}##. This is a basis vector for a one-dimensional vector space ##V = \mathbb{F}\cdot \vec{b_1}##. Then an arbitrary vector can be written as ##\vec{w}=v_1\cdot \vec{b_1}## with ##v_1 \in \mathbb{F}## being the coordinate with respect to the basis ##\{\vec{b_1}\}##. We usually write ##\vec{w}=(v_1)## for short.

    Now let us assume for a moment that ##\lambda \vec{w} \neq \vec{w} \lambda##. Then in our usual coordinate notation we would have ##\lambda \vec{w} = (\lambda v_1) \neq (v_1 \lambda) = \vec{w} \lambda##, which is strange, because the numbers ##\lambda## and ##v_1## do commute as elements of ##\mathbb{F}##.

    This is not a proof that ##\lambda \vec{w} = \vec{w} \lambda##, but a reason why it makes sense, for otherwise we would have to say goodbye to convenient notation like e.g. ##\vec{u} =(1,3,0,-1) \in \mathbb{R}^4##.
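    The coordinate-wise argument is easy to check numerically (a hypothetical numpy sketch using the example vector above):

    ```python
    import numpy as np

    lam = 2.5
    u = np.array([1.0, 3.0, 0.0, -1.0])   # the vector (1, 3, 0, -1) in R^4

    # coordinate-wise, lam * u_i = u_i * lam because field elements commute,
    # so the left and right scalar multiples agree entry by entry
    assert np.array_equal(lam * u, u * lam)
    ```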
     
  19. Dec 2, 2016 #18
    Thanks for the explanation fresh_42 ...

    That really clarified things ...

    Thanks again ...

    Peter
     


