Is This a Valid Vector Space with Unusual Operations?

SUMMARY

The discussion confirms that the defined operations on the set of vectors $\begin{bmatrix} x_1 \\ y_1 \end{bmatrix} \in \mathbb{R}^2$ with $y_1 > 0$ form a valid vector space. The addition operation $\oplus$ and scalar multiplication $\odot$ satisfy closure, associativity, commutativity, the existence of a zero vector, which is $\begin{bmatrix} 0 \\ 1 \end{bmatrix}$, and the existence of additive inverses. The distributive law also holds, confirming the structure satisfies the vector space axioms.

PREREQUISITES
  • Understanding of vector spaces and their properties
  • Familiarity with vector addition and scalar multiplication
  • Knowledge of the distributive law in vector spaces
  • Basic proficiency in LaTeX for mathematical representation
NEXT STEPS
  • Explore the properties of vector spaces in linear algebra
  • Study examples of non-standard vector operations
  • Learn about the implications of scalar multiplication in vector spaces
  • Investigate the role of zero vectors in various vector space definitions
USEFUL FOR

Mathematicians, students of linear algebra, and anyone interested in advanced vector space concepts and operations.

karush
On the set of vectors
$\begin{bmatrix}
x_1 \\ y_1
\end{bmatrix}\in \Bbb{R}^2 $
with $x_1 \in \Bbb{R}$, and $y_1$ in $\Bbb{R}^{+}$ (meaning $y_1 >0$) define an addition by
$$\begin{bmatrix}
x_1 \\ y_1
\end{bmatrix} \oplus
\begin{bmatrix}
x_2 \\ y_2
\end{bmatrix}
=
\begin{bmatrix}
x_1 + x_2 \\ y_1y_2
\end{bmatrix}$$
and a scalar multiplication by
$$ k \odot
\begin{bmatrix}
x \\ y
\end{bmatrix} =
\begin{bmatrix}
k x \\ y^{k}
\end{bmatrix}.
$$
Determine if this is a vector space.
If it is, make sure to explicitly state what the $0$ vector is.
OK, the only thing I could come up with was $2+2=4$ and $2\cdot 2=4$,
and zero vectors are orthogonal with $k=2$
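Before checking axioms, it can help to just compute with the two operations. Here is a minimal numerical sketch of $\oplus$ and $\odot$ as defined above (the function names `oplus`/`odot` and the sample vectors are my own, chosen for illustration):

```python
def oplus(u, v):
    # (x1, y1) ⊕ (x2, y2) = (x1 + x2, y1 * y2)
    return (u[0] + v[0], u[1] * v[1])

def odot(k, u):
    # k ⊙ (x, y) = (k * x, y ** k)
    return (k * u[0], u[1] ** k)

u, v = (2.0, 3.0), (-1.0, 0.5)
print(oplus(u, v))   # (1.0, 1.5)
print(odot(2, u))    # (4.0, 9.0)
```

Note the second component of every vector stays positive: a product of positive reals is positive, and a positive real raised to any real power is positive.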
 
karush said:
On the set of vectors
$\begin{bmatrix}
x_1 \\ y_1
\end{bmatrix}\in \Bbb{R}^2 $
with $x_1 \in \Bbb{R}$, and $y_1$ in $\Bbb{R}^{+}$ (meaning $y_1 >0$) define an addition by
$$\begin{bmatrix}
x_1 \\ y_1
\end{bmatrix} \oplus
\begin{bmatrix}
x_2 \\ y_2
\end{bmatrix}
=
\begin{bmatrix}
x_1 + x_2 \\ y_1y_2
\end{bmatrix}$$
and a scalar multiplication by
$$ k \odot
\begin{bmatrix}
x \\ y
\end{bmatrix} =
\begin{bmatrix}
k x \\ y^{k}
\end{bmatrix}.
$$
Determine if this is a vector space.
If it is, make sure to explicitly state what the $0$ vector is.
OK, the only thing I could come up with was $2+2=4$ and $2\cdot 2=4$,
and zero vectors are orthogonal with $k=2$
"zero vectors?" There's only one.

I've got closure, associativity, a zero vector, additive inverses, and it's even commutative. However, it doesn't obey the distributive law.

Can you get these?

-Dan
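The checklist above can be spot-checked numerically. A minimal sketch of closure, commutativity, and associativity (the `oplus`/`odot` names and the sample vectors are my own):

```python
import math

def oplus(u, v):
    return (u[0] + v[0], u[1] * v[1])

def odot(k, u):
    return (k * u[0], u[1] ** k)

u, v, w = (2.0, 3.0), (-1.0, 0.5), (4.0, 2.0)

# closure: second components stay positive (product/power of positives)
assert oplus(u, v)[1] > 0 and odot(-2, u)[1] > 0

# commutativity of ⊕
assert oplus(u, v) == oplus(v, u)

# associativity of ⊕ (compare with a float tolerance)
lhs, rhs = oplus(oplus(u, v), w), oplus(u, oplus(v, w))
assert all(math.isclose(a, b) for a, b in zip(lhs, rhs))
```

Of course a numerical check on a few vectors is not a proof; the symbolic verifications in the thread are what actually settle each axiom.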
 
OK, I don't see how to check the distributive property here, since the scalar multiplication is nonstandard.
Distributive law: For all real numbers c and all vectors $u, v \in V$, $ c\cdot(u + v) = c\cdot u + c\cdot v$
 
karush said:
OK, I don't see how to check the distributive property here, since the scalar multiplication is nonstandard.
Distributive law: For all real numbers c and all vectors $u, v \in V$, $ c\cdot(u + v) = c\cdot u + c\cdot v$
I had this whole blasted thing written out in LaTeX just to find out I made an error. The distributive law also works.

Here it is anyway.

[math]\left ( k \odot \left [ \begin{matrix} x_1 \\ y_1 \end{matrix} \right ] \right ) \oplus \left ( k \odot \left [ \begin{matrix} x_2 \\ y_2 \end{matrix} \right ] \right ) = \left [ \begin{matrix} kx_1 \\ y_1^k \end{matrix} \right ] \oplus \left [ \begin{matrix} kx_2 \\ y_2^k \end{matrix} \right ] = \left [ \begin{matrix} kx_1 + kx_2 \\ y_1^k y_2^k \end{matrix} \right ][/math]

[math]k \odot \left ( \left [ \begin{matrix} x_1 \\ y_1 \end{matrix} \right ] \oplus \left [ \begin{matrix} x_2 \\ y_2 \end{matrix} \right ] \right ) = k \odot \left [ \begin{matrix} x_1 + x_2 \\ y_1 y_2 \end{matrix} \right ] = \left [ \begin{matrix} k(x_1 + x_2 ) \\ (y_1 y_2)^k \end{matrix} \right ] [/math]

So they are the same.

-Dan
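The two-sided computation above can also be spot-checked numerically; a minimal sketch (the `oplus`/`odot` names, the scalar $k=3$, and the sample vectors are my own):

```python
import math

def oplus(u, v):
    return (u[0] + v[0], u[1] * v[1])

def odot(k, u):
    return (k * u[0], u[1] ** k)

k = 3
u, v = (2.0, 3.0), (-1.0, 0.5)

lhs = odot(k, oplus(u, v))            # k ⊙ (u ⊕ v)
rhs = oplus(odot(k, u), odot(k, v))   # (k ⊙ u) ⊕ (k ⊙ v)

# both sides agree (up to floating-point tolerance), matching
# k(x1 + x2) = kx1 + kx2 and (y1 y2)^k = y1^k y2^k
assert all(math.isclose(a, b) for a, b in zip(lhs, rhs))
```

The symbolic reason is exactly the two identities in the comment: ordinary distributivity in the first component, and the power-of-a-product law in the second.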
 
The $0$ vector (additive identity) is $\begin{bmatrix}0 \\ 1\end{bmatrix}$: for any vector $v = \begin{bmatrix}a \\ b\end{bmatrix}$, $v \oplus 0 = 0 \oplus v = \begin{bmatrix}a + 0 \\ b(1)\end{bmatrix} = \begin{bmatrix}a \\ b\end{bmatrix} = v$.

What about the additive inverse of $\begin{bmatrix}a \\ b\end{bmatrix}$? Calling it $\begin{bmatrix}p \\ q\end{bmatrix}$, we must have $\begin{bmatrix}a \\ b\end{bmatrix} \oplus \begin{bmatrix}p \\ q \end{bmatrix} = \begin{bmatrix}a + p \\ bq \end{bmatrix} = \begin{bmatrix} 0 \\ 1\end{bmatrix}$, so $a + p = 0$ and $bq = 1$, which forces $p = -a$ and $q = 1/b$. That is the reason for the condition $y > 0$: it guarantees $1/b$ exists and stays in the set.
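The identity and inverse formulas above can be spot-checked numerically as well; a minimal sketch (the `oplus`/`neg` names and the sample vector are my own):

```python
import math

def oplus(u, v):
    return (u[0] + v[0], u[1] * v[1])

ZERO = (0.0, 1.0)  # the additive identity found above

def neg(u):
    # additive inverse: -(a, b) = (-a, 1/b); needs b > 0 so 1/b exists and stays positive
    return (-u[0], 1.0 / u[1])

u = (2.0, 3.0)
assert oplus(u, ZERO) == u
assert all(math.isclose(a, b) for a, b in zip(oplus(u, neg(u)), ZERO))
```

Note that `oplus(u, neg(u))` is compared with a tolerance: $b \cdot (1/b)$ may land a floating-point ulp away from exactly $1$.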
 
That was a great help. Much mahalo!

It's hard to find really good help with these.
 
