MHB Is This a Valid Vector Space with Unusual Operations?

karush
On the set of vectors
$\begin{bmatrix}
x_1 \\ y_1
\end{bmatrix}\in \Bbb{R}^2 $
with $x_1 \in \Bbb{R}$, and $y_1$ in $\Bbb{R}^{+}$ (meaning $y_1 >0$) define an addition by
$$\begin{bmatrix}
x_1 \\ y_1
\end{bmatrix} \oplus
\begin{bmatrix}
x_2 \\ y_2
\end{bmatrix}
=
\begin{bmatrix}
x_1 + x_2 \\ y_1y_2
\end{bmatrix}$$
and a scalar multiplication by
$$ k \odot
\begin{bmatrix}
x \\ y
\end{bmatrix} =
\begin{bmatrix}
k x \\ y^{k}
\end{bmatrix}.
$$
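For example, with a couple of sample vectors these operations give
$$\begin{bmatrix} 2 \\ 3 \end{bmatrix} \oplus \begin{bmatrix} -1 \\ 4 \end{bmatrix} = \begin{bmatrix} 2 + (-1) \\ 3 \cdot 4 \end{bmatrix} = \begin{bmatrix} 1 \\ 12 \end{bmatrix}, \qquad 2 \odot \begin{bmatrix} 5 \\ 3 \end{bmatrix} = \begin{bmatrix} 10 \\ 3^2 \end{bmatrix} = \begin{bmatrix} 10 \\ 9 \end{bmatrix}.$$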
Determine if this is a vector space.
If it is, make sure to explicitly state what the $0$ vector is.
OK, the only thing I could come up with was $2+2=4$ and $2\cdot 2=4$,
and the zero vectors are orthogonal with $k=2$.
 
karush said:
OK, the only thing I could come up with was $2+2=4$ and $2\cdot 2=4$,
and the zero vectors are orthogonal with $k=2$.
"zero vectors?" There's only one.

I've got closure, associativity, a zero vector, additive inverses, and it's even commutative. However, it doesn't obey the distributive law.

Can you get these?
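To get you started, closure and commutativity come straight from the corresponding properties of $+$ and $\cdot$ on $\Bbb{R}$:

[math]\left [ \begin{matrix} x_1 \\ y_1 \end{matrix} \right ] \oplus \left [ \begin{matrix} x_2 \\ y_2 \end{matrix} \right ] = \left [ \begin{matrix} x_1 + x_2 \\ y_1 y_2 \end{matrix} \right ] = \left [ \begin{matrix} x_2 + x_1 \\ y_2 y_1 \end{matrix} \right ] = \left [ \begin{matrix} x_2 \\ y_2 \end{matrix} \right ] \oplus \left [ \begin{matrix} x_1 \\ y_1 \end{matrix} \right ][/math]

and the sum stays in the set, since $y_1 y_2 > 0$ whenever $y_1, y_2 > 0$.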

-Dan
 
OK, I don't know how you would try the distributive property, since the scalar multiplication here is different.
Distributive law: For all real numbers c and all vectors $u, v \in V$, $ c\cdot(u + v) = c\cdot u + c\cdot v$
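Presumably with these operations it would have to read
$$k \odot \left( \begin{bmatrix} x_1 \\ y_1 \end{bmatrix} \oplus \begin{bmatrix} x_2 \\ y_2 \end{bmatrix} \right) = \left( k \odot \begin{bmatrix} x_1 \\ y_1 \end{bmatrix} \right) \oplus \left( k \odot \begin{bmatrix} x_2 \\ y_2 \end{bmatrix} \right)?$$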
 
karush said:
OK, I don't know how you would try the distributive property, since the scalar multiplication here is different.
Distributive law: For all real numbers c and all vectors $u, v \in V$, $ c\cdot(u + v) = c\cdot u + c\cdot v$
I had this whole blasted thing written out in LaTeX just to find out I made an error. The distributive law also works.

Here it is anyway.

[math]\left ( k \odot \left [ \begin{matrix} x_1 \\ y_1 \end{matrix} \right ] \right ) \oplus \left ( k \odot \left [ \begin{matrix} x_2 \\ y_2 \end{matrix} \right ] \right ) = \left [ \begin{matrix} kx_1 \\ y_1^k \end{matrix} \right ] \oplus \left [ \begin{matrix} kx_2 \\ y_2^k \end{matrix} \right ] = \left [ \begin{matrix} kx_1 + kx_2 \\ y_1^k y_2^k \end{matrix} \right ][/math]

[math]k \odot \left ( \left [ \begin{matrix} x_1 \\ y_1 \end{matrix} \right ] \oplus \left [ \begin{matrix} x_2 \\ y_2 \end{matrix} \right ] \right ) = k \odot \left [ \begin{matrix} x_1 + x_2 \\ y_1 y_2 \end{matrix} \right ] = \left [ \begin{matrix} k(x_1 + x_2 ) \\ (y_1 y_2)^k \end{matrix} \right ] [/math]

So they are the same, since $k(x_1 + x_2) = kx_1 + kx_2$ and $(y_1 y_2)^k = y_1^k y_2^k$.
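
The other distributive law, with a sum of scalars, drops out the same way:

[math](k + m) \odot \left [ \begin{matrix} x \\ y \end{matrix} \right ] = \left [ \begin{matrix} (k + m) x \\ y^{k + m} \end{matrix} \right ] = \left [ \begin{matrix} kx + mx \\ y^k y^m \end{matrix} \right ] = \left [ \begin{matrix} kx \\ y^k \end{matrix} \right ] \oplus \left [ \begin{matrix} mx \\ y^m \end{matrix} \right ] = \left ( k \odot \left [ \begin{matrix} x \\ y \end{matrix} \right ] \right ) \oplus \left ( m \odot \left [ \begin{matrix} x \\ y \end{matrix} \right ] \right )[/math]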

-Dan
 
The 0 vector (additive identity) is $\begin{bmatrix}0 \\ 1\end{bmatrix}$: for any vector $v = \begin{bmatrix}a \\ b\end{bmatrix}$, $v \oplus 0 = 0 \oplus v = \begin{bmatrix}a + 0 \\ b \cdot 1\end{bmatrix} = \begin{bmatrix}a \\ b\end{bmatrix} = v$.

What about the additive inverse of $\begin{bmatrix}a \\ b\end{bmatrix}$? Calling it $\begin{bmatrix}p \\ q\end{bmatrix}$, we must have $\begin{bmatrix}a \\ b\end{bmatrix} \oplus \begin{bmatrix}p \\ q\end{bmatrix} = \begin{bmatrix}a + p \\ bq\end{bmatrix} = \begin{bmatrix}0 \\ 1\end{bmatrix}$, so $a + p = 0$ and $bq = 1$, which gives $p = -a$ and $q = 1/b$. That is the reason for the condition $y > 0$.
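
The remaining scalar axioms can be checked the same way; for instance, $k \odot \left( m \odot \begin{bmatrix}x \\ y\end{bmatrix} \right) = k \odot \begin{bmatrix}mx \\ y^m\end{bmatrix} = \begin{bmatrix}kmx \\ y^{km}\end{bmatrix} = (km) \odot \begin{bmatrix}x \\ y\end{bmatrix}$ and $1 \odot \begin{bmatrix}x \\ y\end{bmatrix} = \begin{bmatrix}x \\ y^1\end{bmatrix} = \begin{bmatrix}x \\ y\end{bmatrix}$.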
 
That was a great help.
Much Mahalo.

It's hard to find really good help with these.
 