Is This a Valid Vector Space with Unusual Operations?

Discussion Overview

The discussion centers on whether the set of vectors in $\mathbb{R}^2$ whose second component is positive, equipped with unconventional addition and scalar multiplication operations, forms a valid vector space. Participants work through the vector space axioms, including closure, associativity, the existence of a zero vector, additive inverses, and the distributive law.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • One participant defines the vector addition and scalar multiplication operations and asks if the structure is a vector space.
  • Another participant reports verifying closure, associativity, a zero vector, additive inverses, and commutativity, but initially states that the structure does not obey the distributive law.
  • There is a discussion about how to verify the distributive property, with one participant expressing uncertainty about handling different scalars.
  • The same participant later corrects this, providing a detailed verification that the distributive law does hold.
  • One participant identifies the zero vector as $\begin{bmatrix}0 \\ 1\end{bmatrix}$ and discusses the additive inverse for a general vector, deriving conditions based on the operations defined.
  • Another participant expresses gratitude for the assistance received in the discussion.

Areas of Agreement / Disagreement

Participants agree on the zero vector and on the form of the additive inverses. The only contested point is the distributive law, which is first reported to fail and then, after the error is caught, verified in detail to hold.

Contextual Notes

The discussion states, but does not write out, some of the remaining axioms (such as the scalar compatibility properties). The dependence of the additive inverse on the condition $y > 0$, which guarantees the reciprocal of the second component exists and is itself positive, is noted explicitly.

karush
On the set of vectors
$\begin{bmatrix}
x_1 \\ y_1
\end{bmatrix}\in \Bbb{R}^2 $
with $x_1 \in \Bbb{R}$, and $y_1$ in $\Bbb{R}^{+}$ (meaning $y_1 >0$) define an addition by
$$\begin{bmatrix}
x_1 \\ y_1
\end{bmatrix} \oplus
\begin{bmatrix}
x_2 \\ y_2
\end{bmatrix}
=
\begin{bmatrix}
x_1 + x_2 \\ y_1y_2
\end{bmatrix}$$
and a scalar multiplication by
$$ k \odot
\begin{bmatrix}
x \\ y
\end{bmatrix} =
\begin{bmatrix}
k x \\ y^{k}
\end{bmatrix}.
$$
Determine if this is a vector space.
If it is, make sure to explicitly state what the $0$ vector is.
OK, the only thing I could come up with was $2+2=4$ and $2\cdot 2=4$
and zero vectors are orthogonal with $k=2$
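For concreteness, a quick check of the two operations with arbitrary sample vectors (numbers chosen only to illustrate the definitions):
$$\begin{bmatrix}1 \\ 2\end{bmatrix} \oplus \begin{bmatrix}3 \\ 4\end{bmatrix} = \begin{bmatrix}1+3 \\ 2\cdot 4\end{bmatrix} = \begin{bmatrix}4 \\ 8\end{bmatrix}, \qquad 3 \odot \begin{bmatrix}1 \\ 2\end{bmatrix} = \begin{bmatrix}3\cdot 1 \\ 2^{3}\end{bmatrix} = \begin{bmatrix}3 \\ 8\end{bmatrix}.$$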
 
karush said:
OK, the only thing I could come up with was $2+2=4$ and $2\cdot 2=4$
and zero vectors are orthogonal with $k=2$
"zero vectors?" There's only one.

I've got closure, associativity, a zero vector, additive inverses, and it's even commutative. However, it doesn't obey the distributive law.

Can you get these?

-Dan
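As a sketch of one of these, closure follows because both components stay in the right sets: $x_1 + x_2$ and $kx$ are real, while a product of positive numbers and a real power of a positive number are both positive. Under the definitions above,
$$\begin{bmatrix}x_1 \\ y_1\end{bmatrix} \oplus \begin{bmatrix}x_2 \\ y_2\end{bmatrix} = \begin{bmatrix}x_1 + x_2 \\ y_1 y_2\end{bmatrix} \quad\text{and}\quad k \odot \begin{bmatrix}x \\ y\end{bmatrix} = \begin{bmatrix}kx \\ y^{k}\end{bmatrix}$$
both have a positive second component whenever $y_1, y_2, y > 0$.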
 
OK, I don't know how you would check the distributive property here, since the scalar multiplication is defined differently.
Distributive law: for all real numbers $c$ and all vectors $u, v \in V$, $c \odot (u \oplus v) = (c \odot u) \oplus (c \odot v)$.
 
karush said:
OK, I don't know how you would check the distributive property here, since the scalar multiplication is defined differently.
Distributive law: for all real numbers $c$ and all vectors $u, v \in V$, $c \odot (u \oplus v) = (c \odot u) \oplus (c \odot v)$.
I had this whole blasted thing written out in LaTeX just to find out I made an error. The distributive law also works.

Here it is anyway.

[math]\left ( k \odot \left [ \begin{matrix} x_1 \\ y_1 \end{matrix} \right ] \right ) \oplus \left ( k \odot \left [ \begin{matrix} x_2 \\ y_2 \end{matrix} \right ] \right ) = \left [ \begin{matrix} kx_1 \\ y_1^k \end{matrix} \right ] \oplus \left [ \begin{matrix} kx_2 \\ y_2^k \end{matrix} \right ] = \left [ \begin{matrix} kx_1 + kx_2 \\ y_1^k y_2^k \end{matrix} \right ][/math]

[math]k \odot \left ( \left [ \begin{matrix} x_1 \\ y_1 \end{matrix} \right ] \oplus \left [ \begin{matrix} x_2 \\ y_2 \end{matrix} \right ] \right ) = k \odot \left [ \begin{matrix} x_1 + x_2 \\ y_1 y_2 \end{matrix} \right ] = \left [ \begin{matrix} k(x_1 + x_2 ) \\ (y_1 y_2)^k \end{matrix} \right ] [/math]

Since $k(x_1 + x_2) = kx_1 + kx_2$ and $(y_1 y_2)^k = y_1^k y_2^k$, the two sides are the same.

-Dan
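The thread does not write out the remaining scalar axioms; here is a brief sketch along the same lines, using only the definitions above:
$$(k+m)\odot\begin{bmatrix}x \\ y\end{bmatrix} = \begin{bmatrix}(k+m)x \\ y^{k+m}\end{bmatrix} = \begin{bmatrix}kx + mx \\ y^{k}\,y^{m}\end{bmatrix} = \left(k\odot\begin{bmatrix}x \\ y\end{bmatrix}\right) \oplus \left(m\odot\begin{bmatrix}x \\ y\end{bmatrix}\right),$$
$$k\odot\left(m\odot\begin{bmatrix}x \\ y\end{bmatrix}\right) = \begin{bmatrix}k(mx) \\ (y^{m})^{k}\end{bmatrix} = (km)\odot\begin{bmatrix}x \\ y\end{bmatrix}, \qquad 1\odot\begin{bmatrix}x \\ y\end{bmatrix} = \begin{bmatrix}x \\ y^{1}\end{bmatrix} = \begin{bmatrix}x \\ y\end{bmatrix}.$$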
 
The 0 vector (additive identity) is $\begin{bmatrix}0 \\ 1\end{bmatrix}$: for any vector $v= \begin{bmatrix}a \\ b\end{bmatrix}$, $v \oplus 0 = 0 \oplus v = \begin{bmatrix}a+0 \\ b\cdot 1\end{bmatrix} = \begin{bmatrix}a \\ b\end{bmatrix} = v$.

What about the additive inverse of $\begin{bmatrix}a \\ b\end{bmatrix}$? Calling it $\begin{bmatrix}p \\ q\end{bmatrix}$, we must have $\begin{bmatrix}a \\ b\end{bmatrix} \oplus \begin{bmatrix}p \\ q \end{bmatrix} = \begin{bmatrix}a+p \\ bq \end{bmatrix} = \begin{bmatrix} 0 \\ 1\end{bmatrix}$, so $a + p = 0$ and $bq = 1$, which gives $p = -a$ and $q = 1/b$. That is the reason for the condition $y > 0$: it guarantees $1/b$ exists and is itself positive.
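For instance, with arbitrary sample values, the additive inverse of $\begin{bmatrix}2 \\ 3\end{bmatrix}$ is $\begin{bmatrix}-2 \\ 1/3\end{bmatrix}$:
$$\begin{bmatrix}2 \\ 3\end{bmatrix} \oplus \begin{bmatrix}-2 \\ 1/3\end{bmatrix} = \begin{bmatrix}2 + (-2) \\ 3\cdot\tfrac{1}{3}\end{bmatrix} = \begin{bmatrix}0 \\ 1\end{bmatrix}.$$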
 
That was a great help.
Much Mahalo.

It's hard to find really good help with these.
 
