Two ways to define operations in a vector space

SUMMARY

This discussion explores different ways to define the operations of a vector space, focusing on the example ##\mathbb{R}^2##. The participants point out that, besides the traditional definitions of vector addition and scalar multiplication, alternative definitions can also satisfy the vector space axioms. For instance, redefining addition as (x₁, y₁) + (x₂, y₂) = (x₁ + y₂, y₁ + x₂) is presented, although it fails to preserve commutativity. The conversation also touches on redefining addition as componentwise multiplication, leading to further exploration of the implications for the vector space axioms.

PREREQUISITES
  • Understanding of vector space axioms and properties
  • Familiarity with operations in linear algebra
  • Knowledge of matrix operations and their implications
  • Basic concepts of bijections and set theory
NEXT STEPS
  • Research alternative definitions of vector operations in linear algebra
  • Explore the implications of non-standard operations on vector spaces
  • Study the properties of bijections in relation to vector spaces
  • Investigate the role of scalar multiplication in vector space definitions
USEFUL FOR

Mathematicians, students of linear algebra, and educators looking to deepen their understanding of vector space operations and their flexibility in definitions.

Santiago24
Hi PF, I have one question about vector spaces. Is there only one way to define the operations of a vector space? For example, if ##V## is a vector space, is there another way to define its operations, such as scalar multiplication or the addition of its elements, so that the result is also a vector space?
 
Sure. For example, there is a set-theoretic bijection between ##\mathbb{Q}## and ##\mathbb{Q}^2##, since they are both countable. So you could take elements of ##\mathbb{Q}## and define addition/multiplication by mapping them to ##\mathbb{Q}^2##, doing your normal addition/multiplication there, then mapping the result back to ##\mathbb{Q}##. The result turns ##\mathbb{Q}## into a two-dimensional vector space over ##\mathbb{Q}##, but the definitions of addition and scalar multiplication are going to be some gibberish functions you've never seen before (they probably won't even be continuous in the normal topology).
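To make the recipe above concrete, here is a minimal sketch of the transport-of-structure idea. It does not use an actual enumeration of ##\mathbb{Q}## (which would be unwieldy); instead it uses a hypothetical, deliberately simple bijection on ##\mathbb{R}^2## (a shift), and the names `phi`, `oplus` and `smul` are made up for this illustration:

```python
# Transport of structure: given a bijection phi from a set S onto a known
# vector space V, define operations on S by mapping into V, operating there,
# and mapping back.  An enumeration Q -> Q^2 would work the same way, just
# with a far messier phi; here S = R^2 and phi is a simple shift, so the
# "new" zero vector becomes phi_inv((0, 0)) = (1.0, 1.0).

def phi(v):      # hypothetical bijection S -> V, chosen only for illustration
    return (v[0] - 1.0, v[1] - 1.0)

def phi_inv(v):  # its inverse V -> S
    return (v[0] + 1.0, v[1] + 1.0)

def oplus(u, v):  # transported addition: u (+) v = phi_inv(phi(u) + phi(v))
    a, b = phi(u), phi(v)
    return phi_inv((a[0] + b[0], a[1] + b[1]))

def smul(c, v):   # transported scalar multiple: c . v = phi_inv(c * phi(v))
    a = phi(v)
    return phi_inv((c * a[0], c * a[1]))

zero = phi_inv((0.0, 0.0))            # (1.0, 1.0) is the new zero vector
print(oplus((3.0, 4.0), zero))        # (3.0, 4.0): zero really is neutral
print(oplus((3.0, 4.0), (5.0, 6.0)))  # (7.0, 9.0), not the usual (8.0, 10.0)
print(smul(2.0, (3.0, 4.0)))          # (5.0, 7.0), not the usual (6.0, 8.0)
```

All the vector space axioms hold automatically, because they hold in the target space and `phi` is a bijection; only the formulas for the operations look unfamiliar.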
 
I think there are probably infinitely many ways to define the two operations so that they satisfy the properties of a vector space. For example, let's take the vector space ##\mathbb{R}^2##. I think we can "redefine" the operation of addition as $$(x_1,y_1)\mathbf{+}(x_2,y_2)=(x_1+y_2,y_1+x_2)$$ (instead of ##(x_1+x_2,y_1+y_2)##).

I think ##\mathbb{R}^2## under this new addition, together with the usual scalar multiplication, remains a vector space with neutral element ##(0,0)##, but the opposite of an element ##(x,y)## is not ##(-x,-y)##; it is instead ##(-y,-x)##. Everything else remains the same.
 
Delta2 said:
I think there are probably infinitely many ways to define the two operations so that they satisfy the properties of a vector space. For example, let's take the vector space ##\mathbb{R}^2##. I think we can "redefine" the operation of addition as $$(x_1,y_1)\mathbf{+}(x_2,y_2)=(x_1+y_2,y_1+x_2)$$ (instead of ##(x_1+x_2,y_1+y_2)##).

Vector addition must be commutative. This operation is not: $$(x_2, y_2) + (x_1, y_1) = (x_2 + y_1, y_2 + x_1) \neq (x_1, y_1) + (x_2, y_2).$$ Let's consider what conditions we need in order for $$\mathbf{x}_1 \oplus \mathbf{x}_2 \equiv A_1\mathbf{x}_1 + A_2\mathbf{x}_2$$ to be an abelian group operation on ##\mathbb{R}^2##, where ##A_1## and ##A_2## are square matrices and the operations on the right-hand side have their usual meaning. (Your example has ##A_1 = I## and ##A_2 = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}##.)

Commutativity appears to require $$(A_1 - A_2)\mathbf{x} = 0$$ for all ##\mathbf{x} \in \mathbb{R}^2##, so ##A_1 = A_2 = A##.

For associativity we need $$A(A \mathbf{x}_1 + A\mathbf{x}_2) + A\mathbf{x}_3 = A\mathbf{x}_1 + A(A \mathbf{x}_2 + A \mathbf{x}_3)$$ for every ##\mathbf{x}_1##, ##\mathbf{x}_2## and ##\mathbf{x}_3##. From this it follows that ##A^2 = A##, i.e. ##A## is idempotent. An idempotent matrix other than the identity is singular, and a singular ##A## already violates the identity axiom: there is no ##\mathbf{e}## with ##A\mathbf{x} + A\mathbf{e} = \mathbf{x}## for every ##\mathbf{x}##, since the left-hand side always lies in the range of ##A##. So only ##A = I##, the usual addition, survives.
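As a quick numerical illustration of both failures, here is a small sketch (`swap_add` is just a made-up name for the redefined addition from the quoted post):

```python
# The swapped addition from the earlier post:
#   (x1, y1) (+) (x2, y2) = (x1 + y2, y1 + x2),
# i.e. A1 = I and A2 = [[0, 1], [1, 0]] in the notation above.

def swap_add(u, v):
    return (u[0] + v[1], u[1] + v[0])

u, v, w = (1.0, 2.0), (3.0, 5.0), (7.0, 11.0)

print(swap_add(u, v))               # (6.0, 5.0)
print(swap_add(v, u))               # (5.0, 6.0)   -> not commutative
print(swap_add(swap_add(u, v), w))  # (17.0, 12.0)
print(swap_add(u, swap_add(v, w)))  # (13.0, 16.0) -> not associative either
```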
 
Yes, well, shortly after I posted #3 I realized that the redefined addition doesn't satisfy commutativity (and probably not associativity either), but I left my post as it was because, as I said at the start, I believe there are infinitely many (or at least very many) ways to redefine the operations; I just presented one (failed) attempt to redefine addition.

What about if we redefine addition as multiplication, I mean $$(x_1,y_1)\oplus (x_2,y_2)=(x_1x_2,y_1y_2)?$$ I think this satisfies commutativity and associativity; the neutral element becomes ##(1,1)## (instead of ##(0,0)##) and the opposite of ##(x,y)## is ##(1/x,1/y)##. Is ##\mathbb{R}^2 \setminus \{(0,0)\}## then a vector space under this redefined addition (and the usual scalar multiplication)? What do you think @pasmith?
 
Hm, I think now it fails the distributivity axiom $$\lambda (\mathbf{u}\oplus\mathbf{v})=\lambda\mathbf{u}\oplus\lambda\mathbf{v}.$$ Probably we'll have to redefine scalar multiplication too.

Well, anyway, I still believe there are infinitely many ways to define the two operations of a vector space in such a way that the vector space properties hold.
 
Delta2 said:
Yes, well, shortly after I posted #3 I realized that the redefined addition doesn't satisfy commutativity (and probably not associativity either), but I left my post as it was because, as I said at the start, I believe there are infinitely many (or at least very many) ways to redefine the operations; I just presented one (failed) attempt to redefine addition.

What about if we redefine addition as multiplication, I mean $$(x_1,y_1)\oplus (x_2,y_2)=(x_1x_2,y_1y_2)?$$ I think this satisfies commutativity and associativity; the neutral element becomes ##(1,1)## (instead of ##(0,0)##) and the opposite of ##(x,y)## is ##(1/x,1/y)##. Is ##\mathbb{R}^2 \setminus \{(0,0)\}## then a vector space under this redefined addition (and the usual scalar multiplication)? What do you think @pasmith?

That fails: ##(x,0)## has no inverse. You can avoid that by taking your set of vectors to be ##(\mathbb{R} \setminus \{0\})^2 \subsetneq \mathbb{R}^2 \setminus \{(0,0)\}##, but you still have a problem with distributivity of scalar multiplication: $$a(\mathbf{x}_1 \oplus \mathbf{x}_2) = (ax_1x_2, ay_1y_2) \neq (a^2x_1x_2,a^2y_1y_2) = (a\mathbf{x}_1) \oplus (a\mathbf{x}_2)$$ (and you would need to assign a meaning to ##0\mathbf{x}##, since ##(0,0)## is not one of your vectors).

Edit: ##a(x,y) = (x^a, y^a)## might work if you restrict your vectors to ##(0,\infty)^2##.
 
pasmith said:
Edit: ##a(x,y) = (x^a, y^a)## might work if you restrict your vectors to ##(0,\infty)^2##.
Yes, this redefinition of scalar multiplication would fix the distributivity problem and would also assign the neutral element ##(1,1)## to ##0\mathbf{x}##. And I don't see any other problems.
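For what it's worth, here is a small numerical sanity check of this structure on ##(0,\infty)^2## (just a sketch; `oplus` and `smul` are made-up names, and the distributivity lines print both sides rather than testing equality, since floating point can introduce tiny rounding differences):

```python
# Candidate vector space on (0, infinity)^2:
#   "addition":               u (+) v = (u1*v1, u2*v2)
#   "scalar multiplication":  a . v   = (v1**a, v2**a)
# (This is the image of the standard R^2 under (s, t) |-> (e^s, e^t).)

def oplus(u, v):
    return (u[0] * v[0], u[1] * v[1])

def smul(a, v):
    return (v[0] ** a, v[1] ** a)

u, v = (2.0, 3.0), (0.5, 7.0)
a, b = 2.0, -1.5

print(oplus(u, v) == oplus(v, u))              # True: commutativity
print(oplus(u, (1.0, 1.0)) == u)               # True: (1, 1) is the zero vector
print(oplus(u, (1.0 / u[0], 1.0 / u[1])))      # (1.0, 1.0): additive inverse
print(smul(0.0, u))                            # (1.0, 1.0): 0.u is the zero vector
print(smul(a, oplus(u, v)),
      oplus(smul(a, u), smul(a, v)))           # a.(u (+) v)  vs  a.u (+) a.v
print(smul(a + b, u),
      oplus(smul(a, u), smul(b, u)))           # (a + b).u    vs  a.u (+) b.u
print(smul(a, smul(b, u)), smul(a * b, u))     # a.(b.u)      vs  (a*b).u
```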
 
