How do I find a basis of an $\mathbb{R}$-vector space?

mathmari
Hey! :o

I want to prove that $$V=\left \{\begin{pmatrix}a & b\\ c & d\end{pmatrix} \mid a,b,c,d\in \mathbb{C} \text{ and } a+d\in \mathbb{R}\right \}$$ is an $\mathbb{R}$-vector space.
I also want to find a basis of $V$ as an $\mathbb{R}$-vector space. We have the following:
Let $K$ be a field. A vector space over $K$ (or $K$-vector space) is a set $V$ with an addition $V \times V \rightarrow V : (x, y) \mapsto x + y$ and a scalar multiplication $K \times V \rightarrow V : (\lambda , x) \mapsto \lambda \cdot x$, so that the following holds:
  • (V1) : $(V,+)$ is an abelian group, with the neutral element $0$.
  • (V2) : $\forall a, b \in K, \forall x \in V : (a + b) \cdot x = a \cdot x + b \cdot x$
  • (V3) : $\forall a \in K, \forall x, y \in V : a \cdot (x + y) = a \cdot x + a \cdot y$
  • (V4) : $\forall a, b \in K, \forall x \in V : (ab) \cdot x = a \cdot (b \cdot x)$
  • (V5) : $\forall x \in V : 1 \cdot x = x$ ( $1 = 1_K$ is the identity in $K$).
We have that it is closed under addition and scalar multiplication, right? (Wondering)

Properties (V2)-(V5) are also satisfied, aren't they? How can we check property (V1)? (Wondering) Could you give me a hint on how to find a basis? (Wondering)
 
$$\begin{pmatrix}a & b \\ c & d \end{pmatrix}= \begin{pmatrix}a & 0 \\ 0 & 0 \end{pmatrix}+ \begin{pmatrix}0 & b \\ 0 & 0 \end{pmatrix}+ \begin{pmatrix}0 & 0 \\ c & 0 \end{pmatrix}+ \begin{pmatrix}0 & 0 \\ 0 & d \end{pmatrix} = a\begin{pmatrix}1 & 0 \\ 0 & 0 \end{pmatrix}+ b\begin{pmatrix}0 & 1 \\ 0 & 0 \end{pmatrix}+ c\begin{pmatrix}0 & 0 \\ 1 & 0 \end{pmatrix}+ d\begin{pmatrix}0 & 0 \\ 0 & 1 \end{pmatrix}$$
 
HallsofIvy said:
$$\begin{pmatrix}a & b \\ c & d \end{pmatrix}= \begin{pmatrix}a & 0 \\ 0 & 0 \end{pmatrix}+ \begin{pmatrix}0 & b \\ 0 & 0 \end{pmatrix}+ \begin{pmatrix}0 & 0 \\ c & 0 \end{pmatrix}+ \begin{pmatrix}0 & 0 \\ 0 & d \end{pmatrix} = a\begin{pmatrix}1 & 0 \\ 0 & 0 \end{pmatrix}+ b\begin{pmatrix}0 & 1 \\ 0 & 0 \end{pmatrix}+ c\begin{pmatrix}0 & 0 \\ 1 & 0 \end{pmatrix}+ d\begin{pmatrix}0 & 0 \\ 0 & 1 \end{pmatrix}$$

Ah! And so, the basis is $$\{\begin{pmatrix}1 & 0 \\ 0 & 0 \end{pmatrix}, \begin{pmatrix}0 & 1 \\ 0 & 0 \end{pmatrix}, \begin{pmatrix}0 & 0 \\ 1 & 0 \end{pmatrix}, \begin{pmatrix}0 & 0 \\ 0 & 1 \end{pmatrix}\}$$ right? (Wondering)
 
I think we need a couple more elements in the basis, since the entries in the matrix are complex, but the scalars we multiply by are real.
Furthermore, we have the restriction that the imaginary part of the trace must be zero.
I think we need 7 elements in the basis. (Thinking)

mathmari said:
We have that it is closed under addition and scalar multiplication, right? (Wondering)

Properties (V2)-(V5) are also satisfied, aren't they? How can we check property (V1)? (Wondering)

Yes.
For property 1, we already know that regular matrix addition of complex matrices is an abelian group.
So it suffices to verify that it is a subgroup: we have to check that the neutral element is in $V$, and that the additive inverse of every element of $V$ is in $V$ as well. (Thinking)
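As a quick numerical sanity check (my own addition, not part of the thread; `in_V` is a hypothetical membership test for $V$), the two subgroup conditions can be verified for a sample matrix:

```python
import numpy as np

def in_V(M):
    # Membership test for V: a complex 2x2 matrix whose trace is real
    # (up to floating-point tolerance).
    return M.shape == (2, 2) and abs(np.trace(M).imag) < 1e-12

# Neutral element: the zero matrix has trace 0 + 0 = 0, which is real.
assert in_V(np.zeros((2, 2), dtype=complex))

# Additive inverse: if tr(M) is real, then tr(-M) = -tr(M) is real as well.
M = np.array([[1 + 2j, 3j], [4, 1 - 2j]])
assert in_V(M) and in_V(-M)
```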
 
I like Serena said:
I think we need a couple more elements in the basis, since the entries in the matrix are complex, but the scalars we multiply by are real.
Furthermore, we have the restriction that the imaginary part of the trace must be zero.
I think we need 7 elements in the basis. (Thinking)

Ah ok... (Thinking)

Is the basis the following?
$$\left \{\begin{pmatrix}1 & 0 \\ 0 & 0 \end{pmatrix}, \begin{pmatrix}0 & 1 \\ 0 & 0 \end{pmatrix}, \begin{pmatrix}0 & 0 \\ 1 & 0 \end{pmatrix}, \begin{pmatrix}0 & 0 \\ 0 & 1 \end{pmatrix}, \begin{pmatrix} i & 0 \\ 0 & -i \end{pmatrix}, \begin{pmatrix} 0 & i \\ 0 & 0 \end{pmatrix},\begin{pmatrix} 0 & 0 \\ i & 0 \end{pmatrix} \right \}$$

(Wondering)
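As a sanity check (my own addition, not from the thread), the seven matrices can be encoded as vectors in $\mathbb{R}^8$ (real and imaginary parts of the four entries). Since $\dim_{\mathbb{R}} V = 8 - 1 = 7$ (eight real parameters minus the one constraint $\operatorname{Im}(a+d)=0$), rank $7$ confirms that they form a basis:

```python
import numpy as np

# The seven candidate basis matrices from the post above.
E = [np.array(m, dtype=complex) for m in [
    [[1, 0], [0, 0]], [[0, 1], [0, 0]], [[0, 0], [1, 0]], [[0, 0], [0, 1]],
    [[1j, 0], [0, -1j]], [[0, 1j], [0, 0]], [[0, 0], [1j, 0]],
]]

# Each candidate must lie in V: its trace must be real.
assert all(abs(np.trace(M).imag) < 1e-12 for M in E)

def to_r8(M):
    # Encode a complex 2x2 matrix as a vector in R^8.
    v = M.flatten()
    return np.concatenate([v.real, v.imag])

A = np.array([to_r8(M) for M in E])

# Linear independence over R: the 7x8 coordinate matrix has full row rank.
assert np.linalg.matrix_rank(A) == 7
```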
 
Yes (Mmm)
 
I like Serena said:
For property 1, we already know that regular matrix addition of complex matrices is an abelian group.
So it suffices to verify that it is a subgroup: we have to check that the neutral element is in $V$, and that the additive inverse of every element of $V$ is in $V$ as well. (Thinking)

We have that $\begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}\in V$ since $0\in\mathbb{R}\subset \mathbb{C}$.

When $\begin{pmatrix} a & b \\ c & d \end{pmatrix}\in V$ then $\begin{pmatrix} -a & -b \\ -c & -d \end{pmatrix}\in V$, since $-a,-b,-c,-d\in \mathbb{C}$ and $-a-d=-(a+d)\in \mathbb{R}$, since $a+d\in \mathbb{R}$.

Is this correct? (Wondering)
 
Yup. (Mmm)
 
We have the following:

\begin{align*}\begin{pmatrix}a & b \\ c & d \end{pmatrix}& = \begin{pmatrix}a_1+a_2i & b_1+b_2i \\ c_1+c_2i & d_1+d_2i \end{pmatrix} = \begin{pmatrix}a_1 & b_1 \\ c_1 & d_1 \end{pmatrix}+\begin{pmatrix}a_2i & b_2i \\ c_2i & d_2i \end{pmatrix} \\ &=
\left (\begin{pmatrix}a_1 & 0 \\ 0 & 0 \end{pmatrix}+ \begin{pmatrix}0 & b_1 \\ 0 & 0 \end{pmatrix}+ \begin{pmatrix}0 & 0 \\ c_1 & 0 \end{pmatrix}+ \begin{pmatrix}0 & 0 \\ 0 & d_1 \end{pmatrix}\right )+ \left (\begin{pmatrix}a_2i & 0 \\ 0 & d_2i \end{pmatrix}+ \begin{pmatrix}0 & b_2i \\ 0 & 0 \end{pmatrix}+ \begin{pmatrix}0 & 0 \\ c_2i & 0 \end{pmatrix} \right ) \\ &=
\left (a_1\begin{pmatrix}1 & 0 \\ 0 & 0 \end{pmatrix}+ b_1\begin{pmatrix}0 & 1 \\ 0 & 0 \end{pmatrix}+ c_1\begin{pmatrix}0 & 0 \\ 1 & 0 \end{pmatrix}+ d_1\begin{pmatrix}0 & 0 \\ 0 &1 \end{pmatrix}\right )+ \left (\begin{pmatrix}a_2i & 0 \\ 0 & d_2i \end{pmatrix}+ b_2\begin{pmatrix}0 & i \\ 0 & 0 \end{pmatrix}+ c_2\begin{pmatrix}0 & 0 \\ i & 0 \end{pmatrix} \right )\end{align*}

How do we get $\begin{pmatrix}i & 0 \\ 0 & -i \end{pmatrix}$ from $\begin{pmatrix}a_2i & 0 \\ 0 & d_2i \end{pmatrix}$? (Wondering)
 
  • #10
The trace of the matrix must be real, so $\text{Im}(a+d) = 0 \quad\Rightarrow\quad d_2 = -a_2$. (Thinking)
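Spelling this out (my own added step, using the notation $a = a_1 + a_2 i$, $d = d_1 + d_2 i$ from the decomposition above): since $\operatorname{Im}(a+d) = a_2 + d_2 = 0$, we get
$$\begin{pmatrix}a_2 i & 0 \\ 0 & d_2 i \end{pmatrix}=\begin{pmatrix}a_2 i & 0 \\ 0 & -a_2 i \end{pmatrix}=a_2\begin{pmatrix} i & 0 \\ 0 & -i \end{pmatrix},$$
with the real number $a_2$ as the coefficient.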
 
  • #11
I like Serena said:
The trace of the matrix must be real, so $\text{Im}(a+d) = 0 \quad\Rightarrow\quad d_2 = -a_2$. (Thinking)

Ah ok... I see! To show that $V$ is a real vector space, could we maybe show that it is a subspace of the real vector space of complex matrices? (Wondering)

We have that $I_2=\begin{pmatrix}1 & 0\\ 0 & 1\end{pmatrix}$ is an element of $V$ since $1,0\in \mathbb{C}$ and $1+1=2\in \mathbb{R}$.

So, the set is non-empty. We have the following:
$$\begin{pmatrix}a_1 & b_1\\ c_1 & d_1\end{pmatrix}+\begin{pmatrix}a_2 & b_2\\ c_2 & d_2\end{pmatrix}=\begin{pmatrix}a_1+a_2 & b_1+b_2\\ c_1+c_2 & d_1+d_2\end{pmatrix}\in V$$
since $(a_1+a_2)+(d_1+d_2)=(a_1+d_1)+(a_2+d_2)\in \mathbb{R}$, because $a_1+d_1\in \mathbb{R}$ and $a_2+d_2\in \mathbb{R}$.

We have also that $$r\begin{pmatrix}a & b\\ c & d\end{pmatrix}=\begin{pmatrix}ra & rb\\ rc & rd\end{pmatrix}\in V$$
since $ra+rd=r(a+d)\in \mathbb{R}$, since $a+d\in \mathbb{R}$ and $r\in \mathbb{R}$. Or is it not known that the set of complex matrices is a real vector space? (Wondering)
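These two closure checks can also be spot-checked numerically (my own sketch, not from the thread; `random_V` is a hypothetical helper that samples matrices satisfying the trace condition):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_V():
    # Random complex 2x2 matrix; force Im(d) = -Im(a) so that
    # the trace a + d is real, i.e. the matrix lies in V.
    M = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    M[1, 1] = M[1, 1].real - 1j * M[0, 0].imag
    return M

def has_real_trace(M):
    return abs(np.trace(M).imag) < 1e-12

for _ in range(100):
    A, B = random_V(), random_V()
    r = rng.normal()  # a real scalar
    assert has_real_trace(A + B)  # closed under addition
    assert has_real_trace(r * A)  # closed under real scalar multiplication
```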
 
  • #12
mathmari said:
Ah ok... I see! To show that $V$ is a real vector space, could we maybe show that it is a subspace of the real vector space of complex matrices? (Wondering)

What's a "real" vector space? (Wondering)

The space of real matrices with real scalar multiplication is a known real vector space, and the space of complex matrices with complex scalar multiplication is a known complex vector space.
What we need here is the space of complex matrices with real scalar multiplication, which is also a known real vector space. (Note that $V$ is not closed under complex scalars: $i\cdot I_2$ has trace $2i\notin\mathbb{R}$.)
So yes, it suffices to show that $V$ is a subspace of the complex matrices viewed as a real vector space.
mathmari said:
We have that $I_2=\begin{pmatrix}1 & 0\\ 0 & 1\end{pmatrix}$ is an element of $V$ since $1,0\in \mathbb{C}$ and $1+1=2\in \mathbb{R}$.

So, the set is non-empty. We have the following:
$$\begin{pmatrix}a_1 & b_1\\ c_1 & d_1\end{pmatrix}+\begin{pmatrix}a_2 & b_2\\ c_2 & d_2\end{pmatrix}=\begin{pmatrix}a_1+a_2 & b_1+b_2\\ c_1+c_2 & d_1+d_2\end{pmatrix}\in V$$
since $(a_1+a_2)+(d_1+d_2)=(a_1+d_1)+(a_2+d_2)\in \mathbb{R}$, because $a_1+d_1\in \mathbb{R}$ and $a_2+d_2\in \mathbb{R}$.

We have also that $$r\begin{pmatrix}a & b\\ c & d\end{pmatrix}=\begin{pmatrix}ra & rb\\ rc & rd\end{pmatrix}\in V$$
since $ra+rd=r(a+d)\in \mathbb{R}$, since $a+d\in \mathbb{R}$ and $r\in \mathbb{R}$.

The identity matrix does not have to be an element of our vector space, since we're not multiplying matrices; so $I_2$ is not the natural choice for showing non-emptiness.

The zero matrix does have to be an element (implying it's not empty), just like every additive inverse. (Thinking)

And you have already shown closure for addition and closure for scalar multiplication.
 
  • #13
I like Serena said:
The identity matrix does not have to be an element of our vector space, since we're not multiplying matrices; so $I_2$ is not the natural choice for showing non-emptiness.

The zero matrix does have to be an element (implying it's not empty), just like every additive inverse. (Thinking)

And you have already shown closure for addition and closure for scalar multiplication.

Do we know that the zero matrix is in $V$ because for all $u,v\in V$ it must hold that $u+v\in V$, in particular for $u=-v$? (Wondering)
 
  • #14
mathmari said:
Do we know that the zero matrix is in $V$ because for all $u,v\in V$ it must hold that $u+v\in V$, in particular for $u=-v$? (Wondering)

That's one way yes.
With the proof that all additive inverses are in $V$, it follows that the zero matrix is in there as well.
Alternatively we can also tell because its trace, which is $0+0=0$, is a real number. (Thinking)
 
  • #15
I like Serena said:
That's one way yes.
With the proof that all additive inverses are in $V$, it follows that the zero matrix is in there as well.
Alternatively we can also tell because its trace, which is $0+0=0$, is a real number. (Thinking)

I see... Thank you very much! (Mmm)
 
