# Two ways to define operations in a vector space

Santiago24
Hi PF, I have one question about vector spaces. Is there only one way to define the operations of a vector space? For example, if V is a vector space, is there another way to define its operations, like scalar multiplication or the sum of its elements, such that the result is also a vector space?

Sure. For example, there is a set-theoretic bijection between ##\mathbb{Q}## and ##\mathbb{Q}^2##, since both are countable. So you could take elements of ##\mathbb{Q}## and define addition/scalar multiplication by mapping them to ##\mathbb{Q}^2##, doing the normal addition/scalar multiplication there, then mapping the result back to ##\mathbb{Q}##. The result turns ##\mathbb{Q}## into a two-dimensional vector space over ##\mathbb{Q}##, but the definitions of addition and multiplication are going to be some gibberish functions you've never seen before (they probably won't even be continuous in the normal topology).
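This transport-of-structure idea can be sketched in code. The toy version below uses the natural numbers and the Cantor pairing bijection ##\mathbb{N} \leftrightarrow \mathbb{N}^2## in place of an explicit ##\mathbb{Q} \leftrightarrow \mathbb{Q}^2## bijection (the function names are illustrative); it only transports addition, but it already shows how unrecognizable the transported operation looks:

```python
import math

def pair(a, b):
    # Cantor pairing: a bijection from N x N to N
    return (a + b) * (a + b + 1) // 2 + b

def unpair(z):
    # Inverse of the Cantor pairing
    w = (math.isqrt(8 * z + 1) - 1) // 2
    t = w * (w + 1) // 2
    b = z - t
    return (w - b, b)

def oplus(x, y):
    # "Addition" on N transported from componentwise addition on N x N:
    # map both arguments over, add there, map the result back
    (a1, b1), (a2, b2) = unpair(x), unpair(y)
    return pair(a1 + a2, b1 + b2)

# Commutative and associative by construction, but it bears no visible
# resemblance to ordinary addition of integers:
print(oplus(5, 7))                 # 18, not 12
print(oplus(5, 7) == oplus(7, 5))  # True
```

Because `oplus` is just ordinary componentwise addition conjugated by a bijection, every group axiom transfers automatically; the "gibberish" is entirely in the bijection.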

I think there are probably infinitely many ways to define the two operations so that the properties of a vector space are satisfied. For example, take the vector space ##R^2##. I think we can "redefine" the operation of addition as $$(x_1,y_1)\mathbf{+}(x_2,y_2)=(x_1+y_2,y_1+x_2)$$ (instead of ##(x_1+x_2,y_1+y_2)##).

I think ##R^2## under this new addition, with the usual scalar multiplication, remains a vector space with neutral element ##(0,0)##, but the inverse of an element ##(x,y)## is ##(-y,-x)## instead of ##(-x,-y)##; everything else remains the same.

Delta2 said:
I think there are probably infinitely many ways to define the two operations so that the properties of a vector space are satisfied. For example, take the vector space ##R^2##. I think we can "redefine" the operation of addition as $$(x_1,y_1)\mathbf{+}(x_2,y_2)=(x_1+y_2,y_1+x_2)$$ (instead of ##(x_1+x_2,y_1+y_2)##).

Vector addition must be commutative. This operation is not: $$(x_2, y_2) + (x_1, y_1) = (x_2 + y_1, y_2 + x_1) \neq (x_1, y_1) + (x_2, y_2).$$ Let's consider what conditions we need in order for $$\mathbf{x}_1 \oplus \mathbf{x}_2 \equiv A_1\mathbf{x}_1 + A_2\mathbf{x}_2$$ to be an abelian group on $\mathbb{R}^2$, where $A_1$ and $A_2$ are square matrices and operations on the right hand side have their usual meaning. (Your example has $A_1 = I$ and $A_2 = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}$.)

Commutativity requires $$A_1\mathbf{x}_1 + A_2\mathbf{x}_2 = A_1\mathbf{x}_2 + A_2\mathbf{x}_1,$$ i.e. $(A_1 - A_2)(\mathbf{x}_1 - \mathbf{x}_2) = 0$, for all $\mathbf{x}_1, \mathbf{x}_2 \in \mathbb{R}^2$, so $A_1 = A_2 = A$.

For associativity we need $$A(A \mathbf{x}_1 + A\mathbf{x}_2) + A\mathbf{x}_3 = A\mathbf{x}_1 + A(A \mathbf{x}_2 + A \mathbf{x}_3)$$ for every $\mathbf{x}_1$, $\mathbf{x}_2$ and $\mathbf{x}_3$. From this it follows that $$A^2 = A,$$ so $A$ is a projection. For an identity element $\mathbf{e}$ we also need $A\mathbf{x} + A\mathbf{e} = \mathbf{x}$ for every $\mathbf{x}$; this forces $A$ to be invertible, and the only invertible idempotent is $A = I$. So $\oplus$ of this form must be ordinary addition.
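The failure of commutativity in the original example can be spot-checked numerically. Here is a minimal Python sketch (the helper names are illustrative) with $A_1 = I$ and $A_2$ the swap matrix, plus the $A_1 = A_2 = I$ case for comparison:

```python
def matvec(A, x):
    # 2x2 matrix times 2-vector, with A given as a tuple of rows
    return (A[0][0] * x[0] + A[0][1] * x[1],
            A[1][0] * x[0] + A[1][1] * x[1])

def oplus(x, y, A1, A2):
    # Candidate "addition": x (+) y = A1 x + A2 y
    a, b = matvec(A1, x), matvec(A2, y)
    return (a[0] + b[0], a[1] + b[1])

I = ((1, 0), (0, 1))
S = ((0, 1), (1, 0))  # the swap matrix from the example above

u, v = (1, 2), (3, 5)

print(oplus(u, v, I, S))  # (6, 5)
print(oplus(v, u, I, S))  # (5, 6) -- not commutative
print(oplus(u, v, I, I))  # (4, 7) -- ordinary addition
```

A single counterexample like this is enough to rule out commutativity, though of course only the algebra above proves that $A = I$ is the sole option.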

Yes, well, shortly after I posted #3 I realized that the redefined addition doesn't satisfy commutativity (and probably not associativity either), but I left my post as it was because, as I said at the start, I believe there are infinitely many (or at least very many) ways to redefine the operations; I just presented one (failed) attempt to redefine addition.

What if we redefine addition as multiplication, I mean $$(x_1,y_1)\oplus (x_2,y_2)=(x_1x_2,y_1y_2)?$$ I think this satisfies commutativity and associativity, the neutral element becomes ##(1,1)## (instead of ##(0,0)##), and the opposite of ##(x,y)## is ##(1/x,1/y)##. Is ##R^2-\{(0,0)\}## then a vector space under this redefined addition (and the usual scalar multiplication)? What do you think @pasmith ?

Hm, I think now it fails the distributivity axiom $$\lambda (\mathbf{u}\oplus\mathbf{v})=\lambda\mathbf{u}\oplus\lambda\mathbf{v}.$$ Probably we'll have to redefine scalar multiplication too.

Well, anyway, I still believe there are infinitely many ways to define the two operations of a vector space in such a way that the vector space properties hold.

Delta2 said:
Yes, well, shortly after I posted #3 I realized that the redefined addition doesn't satisfy commutativity (and probably not associativity either), but I left my post as it was because, as I said at the start, I believe there are infinitely many (or at least very many) ways to redefine the operations; I just presented one (failed) attempt to redefine addition.

What if we redefine addition as multiplication, I mean $$(x_1,y_1)\oplus (x_2,y_2)=(x_1x_2,y_1y_2)?$$ I think this satisfies commutativity and associativity, the neutral element becomes ##(1,1)## (instead of ##(0,0)##), and the opposite of ##(x,y)## is ##(1/x,1/y)##. Is ##R^2-\{(0,0)\}## then a vector space under this redefined addition (and the usual scalar multiplication)? What do you think @pasmith ?

That fails: $(x,0)$ has no inverse. You can avoid that by taking your set of vectors to be $(\mathbb{R} \setminus \{0\})^2 \subsetneq \mathbb{R}^2 \setminus \{(0,0)\}$, but you still have a problem with distributivity of scalar multiplication: $$a(\mathbf{x}_1 \oplus \mathbf{x}_2) = (ax_1x_2, ay_1y_2) \neq (a^2x_1x_2,a^2y_1y_2) = (a\mathbf{x}_1) \oplus (a\mathbf{x}_2)$$ (and you would need to assign a meaning to $0\mathbf{x}$ since $(0,0)$ is not one of your vectors).

Edit: $a(x,y) = (x^a, y^a)$ might work if you restrict your vectors to $(0,\infty)^2$.

pasmith said:
Edit: $a(x,y) = (x^a, y^a)$ might work if you restrict your vectors to $(0,\infty)^2$.
Yes, this redefinition of scalar multiplication fixes the distributivity problem and also assigns ##0\mathbf{x}## the value ##(1,1)##, the neutral element. And I don't see any other problems.
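For what it's worth, the repaired structure on $(0,\infty)^2$ can be spot-checked numerically. Below is a minimal Python sketch (the helper names `oplus`, `smul`, `close` are illustrative) that verifies the axioms at a few sample points; the final check hints at why everything works: componentwise log carries the whole structure onto ordinary ##\mathbb{R}^2##.

```python
import math

def oplus(u, v):
    # Redefined "addition": componentwise product
    return (u[0] * v[0], u[1] * v[1])

def smul(a, u):
    # Redefined scalar multiplication: componentwise power
    return (u[0] ** a, u[1] ** a)

def close(u, v):
    # Componentwise floating-point comparison
    return all(math.isclose(x, y) for x, y in zip(u, v))

u, v = (2.0, 3.0), (5.0, 7.0)
a, b = 1.5, -2.0

# a(u (+) v) = (au) (+) (av): holds now, unlike with the usual au = (ax, ay)
assert close(smul(a, oplus(u, v)), oplus(smul(a, u), smul(a, v)))
# (a + b)u = (au) (+) (bu)
assert close(smul(a + b, u), oplus(smul(a, u), smul(b, u)))
# neutral element (1, 1), and 0u = (1, 1)
assert close(oplus(u, (1.0, 1.0)), u)
assert close(smul(0, u), (1.0, 1.0))
# the inverse of (x, y) is (1/x, 1/y)
assert close(oplus(u, smul(-1, u)), (1.0, 1.0))

# Why it works: componentwise log turns (+) into ordinary + on R^2
log_sum = tuple(map(math.log, oplus(u, v)))
sum_logs = (math.log(u[0]) + math.log(v[0]), math.log(u[1]) + math.log(v[1]))
assert close(log_sum, sum_logs)
print("all checks passed")
```

A few sample points do not prove the axioms, but here the log map does: it is a bijection from $(0,\infty)^2$ to ##\mathbb{R}^2## sending $\oplus$ to ordinary addition and $a(x,y)=(x^a,y^a)$ to ordinary scaling, so every axiom transfers.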


## 1. What is a vector space?

A vector space is a mathematical structure consisting of a set of objects, called vectors, and a set of operations that can be performed on those vectors. These operations include addition and scalar multiplication, and they must follow specific rules to be considered a vector space.

## 2. What are the two ways to define operations in a vector space?

The two ways to define operations in a vector space are through axioms and through a basis and coordinate system. Axioms are a set of rules that must be satisfied for a set to be considered a vector space. A basis and coordinate system involves choosing a set of basis vectors that can be used to represent any vector in the space, and then defining operations based on those basis vectors.

## 3. What are the benefits of defining operations in a vector space through axioms?

Defining operations in a vector space through axioms allows for a more general and abstract understanding of the space. It also allows for the creation of new vector spaces by satisfying the axioms with different types of objects.

## 4. How does a basis and coordinate system define operations in a vector space?

In a basis and coordinate system, the operations are defined by how the basis vectors behave under addition and scalar multiplication. The basis vectors are used to represent any vector in the space, and the operations are defined based on how these basis vectors interact with each other.
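As a small illustration of this coordinate viewpoint, here is a Python sketch (the basis ##\{(1,1), (1,-1)\}## and the helper names are chosen arbitrarily for the example): adding two vectors by adding their coordinate vectors and mapping back gives the same result as direct componentwise addition.

```python
def to_coords(v):
    # Coordinates of v = (x, y) in the basis b1 = (1, 1), b2 = (1, -1):
    # v = c1*b1 + c2*b2  =>  c1 = (x + y)/2, c2 = (x - y)/2
    return ((v[0] + v[1]) / 2, (v[0] - v[1]) / 2)

def from_coords(c):
    # Rebuild v from its coordinates: c1*(1, 1) + c2*(1, -1)
    return (c[0] + c[1], c[0] - c[1])

u, v = (3.0, 1.0), (2.0, 4.0)

# Addition defined "through the basis": add coordinate vectors, map back
cu, cv = to_coords(u), to_coords(v)
s = from_coords((cu[0] + cv[0], cu[1] + cv[1]))
print(s)  # (5.0, 5.0) -- agrees with direct componentwise addition
```

Because the coordinate map is linear and invertible, this agreement holds for every basis, which is exactly why any basis defines the same operations.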

## 5. Can a vector space have more than one basis and coordinate system?

Yes, a vector space can have many different bases and coordinate systems, because there are typically many different sets of basis vectors that can represent the same space. However, the operations must still satisfy the axioms in order for the set to be considered a vector space, no matter which basis is used to describe them.
