MHB Is This a Valid Vector Space with Unusual Operations?

Summary:
The discussion centers on determining whether a set of vectors defined with specific addition and scalar multiplication operations forms a valid vector space. The operations involve adding vectors by summing their first components and multiplying their second components, while scalar multiplication scales the first component and raises the second component to the scalar power. Key properties such as closure, associativity, and the existence of a zero vector, identified as [0, 1], are confirmed, along with additive inverses. However, the distributive law was initially questioned but later verified to hold true. Ultimately, the set satisfies the criteria for a vector space under the defined operations.
karush
On the set of vectors
$\begin{bmatrix}
x_1 \\ y_1
\end{bmatrix}\in \Bbb{R}^2 $
with $x_1 \in \Bbb{R}$, and $y_1$ in $\Bbb{R}^{+}$ (meaning $y_1 >0$) define an addition by
$$\begin{bmatrix}
x_1 \\ y_1
\end{bmatrix} \oplus
\begin{bmatrix}
x_2 \\ y_2
\end{bmatrix}
=
\begin{bmatrix}
x_1 + x_2 \\ y_1y_2
\end{bmatrix}$$
and a scalar multiplication by
$$ k \odot
\begin{bmatrix}
x \\ y
\end{bmatrix} =
\begin{bmatrix}
k x \\ y^{k}
\end{bmatrix}.
$$
Determine if this is a vector space.
If it is, make sure to explicitly state what the $0$ vector is.
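
For concreteness, a sample computation with these operations (the numbers are chosen arbitrarily):
$$\begin{bmatrix} 1 \\ 2 \end{bmatrix} \oplus \begin{bmatrix} 3 \\ 4 \end{bmatrix} = \begin{bmatrix} 1+3 \\ 2\cdot 4 \end{bmatrix} = \begin{bmatrix} 4 \\ 8 \end{bmatrix}, \qquad 3 \odot \begin{bmatrix} 1 \\ 2 \end{bmatrix} = \begin{bmatrix} 3\cdot 1 \\ 2^{3} \end{bmatrix} = \begin{bmatrix} 3 \\ 8 \end{bmatrix}.$$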
OK, the only thing I could come up with was that $2+2=4$ and $2\cdot 2=4$,
and that zero vectors are orthogonal with $k=2$.
 
karush said:
OK, the only thing I could come up with was that $2+2=4$ and $2\cdot 2=4$,
and that zero vectors are orthogonal with $k=2$.
"zero vectors?" There's only one.

I've got closure, associativity, a zero vector, additive inverses, and it's even commutative. However, it doesn't seem to obey the distributive law.

Can you get these?

-Dan
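
For reference, two of these checks written out (the thread itself does not spell them out). Closure: for $x_1, x_2 \in \Bbb{R}$ and $y_1, y_2 > 0$ we have $x_1 + x_2 \in \Bbb{R}$ and $y_1 y_2 > 0$, and likewise $kx \in \Bbb{R}$ and $y^k > 0$ for every real $k$, so both operations stay in the set. Commutativity reduces to commutativity of ordinary addition and multiplication:
$$\begin{bmatrix} x_1 \\ y_1 \end{bmatrix} \oplus \begin{bmatrix} x_2 \\ y_2 \end{bmatrix} = \begin{bmatrix} x_1 + x_2 \\ y_1 y_2 \end{bmatrix} = \begin{bmatrix} x_2 + x_1 \\ y_2 y_1 \end{bmatrix} = \begin{bmatrix} x_2 \\ y_2 \end{bmatrix} \oplus \begin{bmatrix} x_1 \\ y_1 \end{bmatrix}.$$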
 
OK, I don't see how you would check the distributive property, since the scalar multiplication here is different.
Distributive law: for all real numbers $c$ and all vectors $u, v \in V$, $c \odot (u \oplus v) = (c \odot u) \oplus (c \odot v)$.
 
karush said:
OK, I don't see how you would check the distributive property, since the scalar multiplication here is different.
I had this whole blasted thing written out in LaTeX just to find out I made an error. The distributive law also works.

Here it is anyway.

[math]\left ( k \odot \left [ \begin{matrix} x_1 \\ y_1 \end{matrix} \right ] \right ) \oplus \left ( k \odot \left [ \begin{matrix} x_2 \\ y_2 \end{matrix} \right ] \right ) = \left [ \begin{matrix} kx_1 \\ y_1^k \end{matrix} \right ] \oplus \left [ \begin{matrix} kx_2 \\ y_2^k \end{matrix} \right ] = \left [ \begin{matrix} kx_1 + kx_2 \\ y_1^k y_2^k \end{matrix} \right ][/math]

[math]k \odot \left ( \left [ \begin{matrix} x_1 \\ y_1 \end{matrix} \right ] \oplus \left [ \begin{matrix} x_2 \\ y_2 \end{matrix} \right ] \right ) = k \odot \left [ \begin{matrix} x_1 + x_2 \\ y_1 y_2 \end{matrix} \right ] = \left [ \begin{matrix} k(x_1 + x_2 ) \\ (y_1 y_2)^k \end{matrix} \right ] [/math]

Since $k(x_1 + x_2) = kx_1 + kx_2$ and $(y_1 y_2)^k = y_1^k y_2^k$, they are the same.

-Dan
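
For completeness (these checks are also not written out in the thread), the remaining scalar axioms hold for the same reasons, via the exponent laws $y^{c+d} = y^c y^d$ and $(y^d)^c = y^{cd}$:
$$(c+d) \odot \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} (c+d)x \\ y^{c+d} \end{bmatrix} = \begin{bmatrix} cx + dx \\ y^c y^d \end{bmatrix} = \left( c \odot \begin{bmatrix} x \\ y \end{bmatrix} \right) \oplus \left( d \odot \begin{bmatrix} x \\ y \end{bmatrix} \right),$$
$$c \odot \left( d \odot \begin{bmatrix} x \\ y \end{bmatrix} \right) = c \odot \begin{bmatrix} dx \\ y^d \end{bmatrix} = \begin{bmatrix} cdx \\ y^{cd} \end{bmatrix} = (cd) \odot \begin{bmatrix} x \\ y \end{bmatrix}, \qquad 1 \odot \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} x \\ y^1 \end{bmatrix} = \begin{bmatrix} x \\ y \end{bmatrix}.$$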
 
The $0$ vector (additive identity) is $\begin{bmatrix}0 \\ 1\end{bmatrix}$, which lies in the set since $1 > 0$: for any vector $v = \begin{bmatrix}a \\ b\end{bmatrix}$, $v \oplus 0 = 0 \oplus v = \begin{bmatrix}a + 0 \\ b(1)\end{bmatrix} = \begin{bmatrix}a \\ b\end{bmatrix} = v$.

What about the additive inverse of $\begin{bmatrix}a \\ b\end{bmatrix}$? Calling it $\begin{bmatrix}p \\ q\end{bmatrix}$, we must have $\begin{bmatrix}a \\ b\end{bmatrix} \oplus \begin{bmatrix}p \\ q\end{bmatrix} = \begin{bmatrix}a + p \\ bq\end{bmatrix} = \begin{bmatrix}0 \\ 1\end{bmatrix}$, so $a + p = 0$ and $bq = 1$, which forces $p = -a$ and $q = 1/b$. That is the reason for the condition $y > 0$: it guarantees that $1/b$ exists and is itself positive.
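
As a quick numerical check (numbers chosen arbitrarily), the additive inverse of $\begin{bmatrix} 3 \\ 2 \end{bmatrix}$ is $\begin{bmatrix} -3 \\ 1/2 \end{bmatrix}$:
$$\begin{bmatrix} 3 \\ 2 \end{bmatrix} \oplus \begin{bmatrix} -3 \\ 1/2 \end{bmatrix} = \begin{bmatrix} 3 + (-3) \\ 2 \cdot \tfrac{1}{2} \end{bmatrix} = \begin{bmatrix} 0 \\ 1 \end{bmatrix}.$$
One way to see why every axiom works out: the map $\begin{bmatrix} x \\ y \end{bmatrix} \mapsto \begin{bmatrix} x \\ \ln y \end{bmatrix}$ carries $\oplus$ and $\odot$ to the standard addition and scalar multiplication on $\Bbb{R}^2$, since $\ln(y_1 y_2) = \ln y_1 + \ln y_2$ and $\ln(y^k) = k \ln y$.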
 
That was a great help. Much Mahalo!

It's hard to find really good help with these.
 