Why is scalar multiplication on vector spaces not commutative?

honestrosewater
(Or if you prefer: Why are things defined this way?) I noticed that, in my book's definition, scalar multiplication (SM) on vector spaces lacks two familiar things: commutativity and inverses.

The multiplicative inverse concept doesn't seem to apply to SM. Can it? I can't imagine how it could because, for one thing, the multiplicative identity is a scalar while the product of SM is a vector. (Right? I can't find a definition that actually says that 1 is the identity, but that's what I take ##1v = v## for all ##v \in V## to mean. And is 1 meant to be just the multiplicative identity of the set over which the vector space is defined, whatever it happens to be?)

I guess that SM isn't required to be commutative because you want to be able to define vector spaces over different kinds of sets? But isn't SM commutative on vector spaces over fields? That is, for example, if ##F## is a field, ##a \in F##, and ##(x_1, \dots, x_n) \in F^n##, then I could define SM' as

##a(x_1, \dots, x_n) = (ax_1, \dots, ax_n) = (x_1, \dots, x_n)a##

and, as long as everything else holds, ##F^n## with SM' would be a vector space? Or is commutativity for SM interpreted in another way? For example,

##(ab)v = (ba)v##
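
If it helps to see this computationally, here is a minimal sketch in plain Python, with tuples of rationals standing in for elements of ##F^n## (the helper names left_sm and right_sm are mine, not standard):

```python
from fractions import Fraction

def left_sm(a, v):
    """Left scalar multiplication: a(x1, ..., xn) = (a*x1, ..., a*xn)."""
    return tuple(a * x for x in v)

def right_sm(v, a):
    """Right scalar multiplication: (x1, ..., xn)a = (x1*a, ..., xn*a)."""
    return tuple(x * a for x in v)

a = Fraction(3, 2)
v = (Fraction(1), Fraction(-4), Fraction(5, 7))

# Because multiplication in the field Q is commutative,
# the two definitions agree componentwise.
assert left_sm(a, v) == right_sm(v, a)
print(left_sm(a, v))  # (Fraction(3, 2), Fraction(-6, 1), Fraction(15, 14))
```

The assertion holds precisely because multiplication in ##\mathbb{Q}## is commutative; over a non-commutative ring of scalars the two definitions would part ways.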
 
The general concept is that of a module. There are two kinds, left-modules and right-modules, and the scalars form a ring. (Rings need not be commutative, nor must their elements have multiplicative inverses; the ring of all ##n \times n## matrices is a good example.)
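
To make that example concrete, here is a minimal sketch using NumPy (the particular matrices are an arbitrary choice of mine):

```python
import numpy as np

# Two 2x2 matrices in the ring of n x n matrices (here n = 2).
A = np.array([[0, 1],
              [0, 0]])
B = np.array([[0, 0],
              [1, 0]])

# Matrix multiplication is not commutative: AB != BA.
print(A @ B)  # [[1 0], [0 0]]
print(B @ A)  # [[0 0], [0 1]]

# A is also singular (det A = 0), so it has no multiplicative inverse.
print(np.linalg.det(A))  # 0.0
```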

A left-module is one where scalar multiplication is written on the left, and a right-module is one where scalar multiplication is written on the right.

If ##m## is an element of a right-module and ##a, b## are elements of the ring of scalars, then ##(ma)b = m(ab)##.

Generally, right modules and left modules are different things. If we have a right module, where ##(ma)b = m(ab)##, and we try to write its scalar multiplication on the left, then we'd have ##b(am) = (ab)m##, with the scalars coming out in the opposite order. :frown:
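
Here is a small sketch of that order reversal, using row vectors as a right module over the ##2 \times 2## matrices (a standard example, though the particular matrices are mine):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 1],
              [1, 1]])
v = np.array([[1, 0]])  # a row vector, acted on from the right

# Right-module axiom: (vA)B == v(AB).
assert np.array_equal((v @ A) @ B, v @ (A @ B))

# Acting by A and then by B always gives v(AB). If we insist on writing
# the action on the left, that same composite reads B(Av) = (AB)v,
# not (BA)v, and for non-commutative scalars the two differ:
assert not np.array_equal(v @ (A @ B), v @ (B @ A))
```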

But when the ring of scalars is commutative, the concepts of left-module and right-module are equivalent, since we can define ##ma := am## to convert a left-module into a right-module.

By convention, we write vector spaces as left-modules, so only ##av## is defined when ##a## is a scalar and ##v## is a vector, though over a field we could define ##va## too if we wanted. However, over a division ring (where we have inverses but not commutativity), we still call the module a vector space, and there is a genuine difference between a left vector space and a right vector space. (If you want an example, take a vector space over the quaternions.)
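
A minimal sketch of the quaternion example (hand-rolled Hamilton product; the helper name qmul is mine): componentwise left and right scalar multiplication on ##\mathbb{H}^2## really do disagree.

```python
def qmul(p, q):
    """Hamilton product of quaternions given as (w, x, y, z) tuples."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

i = (0, 1, 0, 0)
j = (0, 0, 1, 0)

# The scalars themselves do not commute: ij = k but ji = -k.
print(qmul(i, j))  # (0, 0, 0, 1)
print(qmul(j, i))  # (0, 0, 0, -1)

# So on a vector in H^2, left and right scalar multiplication differ.
v = (i, j)  # an element of H^2, with quaternion entries
left  = tuple(qmul(i, x) for x in v)   # a(x1, x2) = (a x1, a x2)
right = tuple(qmul(x, i) for x in v)   # (x1, x2)a = (x1 a, x2 a)
print(left == right)  # False
```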
 
Sorry this is so late. That helped; thank you. :smile:
 
How in the world could scalar multiplication be commutative?
Scalar multiplication is the product of a scalar and a vector; you can't interchange them. Of course, if you just want to say that it doesn't matter how you write the product, ##\lambda v = v\lambda## where ##\lambda## is a scalar and ##v## is a vector, then that's trivially true, but that is not what "commutative" means!
 