# Prove the statements: Vectors/Matrices

Gold Member
MHB
Hey!

Let $1\leq n\in \mathbb{N}$.

• Prove that for all $v\in \mathbb{R}^n$ it holds that $v+0_{\mathbb{R}^n}=v=0_{\mathbb{R}^n}+v$.
• Prove that for all $\lambda\in \mathbb{R}$ and $v,w\in \mathbb{R}^n$ it holds that $\lambda (v+w)=\lambda v+\lambda w$.
• Let $M_2(\mathbb{R}):=\left \{\begin{pmatrix}a & b \\ c & d\end{pmatrix}\mid a, b, c, d\in \mathbb{R}\right \}$ be the set of all $2\times 2$-matrices over $\mathbb{R}$. We also define a multiplication on this set by \begin{equation*}\begin{pmatrix}a & b \\ c & d\end{pmatrix}\cdot \begin{pmatrix}a' & b' \\ c' & d'\end{pmatrix}=\begin{pmatrix}aa'+bc' & ab'+bd' \\ ca'+dc' & cb'+dd'\end{pmatrix}\end{equation*}
1. Show that the multiplication over $M_2(\mathbb{R})$ is associative.
2. Is the multiplication over $M_2(\mathbb{R})$ commutative?
3. Is there a neutral element with respect to the multiplication over $M_2(\mathbb{R})$?

I have done the following:

• How can we prove this property? (Wondering)

• Could you also give me a hint for this one? (Wondering)

1. Let $A=\begin{pmatrix}a & b \\ c & d\end{pmatrix}, \ B=\begin{pmatrix}e & f \\ g & h\end{pmatrix}, \ C=\begin{pmatrix}i & j \\ k & \ell\end{pmatrix}$.

Then we have the following: \begin{align*}(A\cdot B)\cdot C&=\left (\begin{pmatrix}a & b \\ c & d\end{pmatrix}\cdot \begin{pmatrix}e & f \\ g & h\end{pmatrix}\right )\cdot \begin{pmatrix}i & j \\ k & \ell\end{pmatrix}= \begin{pmatrix}ae+bg & af+bh \\ ce+dg & cf+dh\end{pmatrix}\cdot \begin{pmatrix}i & j \\ k & \ell\end{pmatrix}\\ & = \begin{pmatrix}(ae+bg)i+(af+bh)k & (ae+bg)j+(af+bh)\ell \\ (ce+dg)i+(cf+dh)k & (ce+dg)j+(cf+dh)\ell\end{pmatrix}\\ & = \begin{pmatrix}aei+bgi+afk+bhk & aej+bgj+af\ell+bh\ell \\ cei+dgi+cfk+dhk & cej+dgj+cf\ell+dh\ell\end{pmatrix}\end{align*}

\begin{align*}A\cdot (B\cdot C)&=\begin{pmatrix}a & b \\ c & d\end{pmatrix}\cdot\left ( \begin{pmatrix}e & f \\ g & h\end{pmatrix}\cdot \begin{pmatrix}i & j \\ k & \ell\end{pmatrix}\right )= \begin{pmatrix}a & b \\ c & d\end{pmatrix}\cdot\begin{pmatrix}ei+fk & ej+f\ell \\ gi+hk & gj+h\ell\end{pmatrix} \\ & = \begin{pmatrix}a(ei+fk)+b(gi+hk) & a(ej+f\ell)+b(gj+h\ell) \\ c(ei+fk)+d(gi+hk) & c(ej+f\ell)+d(gj+h\ell)\end{pmatrix} \\ & = \begin{pmatrix}aei+afk+bgi+bhk & aej+af\ell+bgj+bh\ell \\ cei+cfk+dgi+dhk & cej+cf\ell+dgj+dh\ell\end{pmatrix}\\ & = \begin{pmatrix}aei+bgi+afk+bhk & aej+bgj+af\ell+bh\ell \\ cei+dgi+cfk+dhk & cej+dgj+cf\ell+dh\ell\end{pmatrix}\end{align*}

The results are the same. Therefore it holds that $(A\cdot B)\cdot C=A\cdot (B\cdot C)$, which means that the multiplication over $M_2(\mathbb{R})$ is associative.
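To double-check the expansion, here is a minimal symbolic sketch (assuming Python with sympy, which is not part of the original post) that expands both products and compares them entrywise:

```python
# Symbolic sanity check that (A·B)·C and A·(B·C) expand to the same matrix.
import sympy as sp

a, b, c, d, e, f, g, h, i, j, k, l = sp.symbols('a b c d e f g h i j k l')
A = sp.Matrix([[a, b], [c, d]])
B = sp.Matrix([[e, f], [g, h]])
C = sp.Matrix([[i, j], [k, l]])

lhs = ((A * B) * C).expand()  # entries of (A·B)·C, fully expanded
rhs = (A * (B * C)).expand()  # entries of A·(B·C), fully expanded
print(lhs == rhs)  # True
```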
2. Let $A=\begin{pmatrix}a & b \\ c & d\end{pmatrix}, \ B=\begin{pmatrix}e & f \\ g & h\end{pmatrix}$.

Then we have the following: \begin{equation*}A\cdot B=\begin{pmatrix}a & b \\ c & d\end{pmatrix}\cdot \begin{pmatrix}e & f \\ g & h\end{pmatrix}=\begin{pmatrix}ae+bg & af+bh \\ ce+dg & cf+dh\end{pmatrix} \end{equation*}

On the other hand: \begin{equation*}B\cdot A=\begin{pmatrix}e & f \\ g & h\end{pmatrix}\cdot \begin{pmatrix}a & b \\ c & d\end{pmatrix}=\begin{pmatrix}ea+fc & eb+fd \\ ga+hc & gb+hd\end{pmatrix} \end{equation*}

We see that $A\cdot B\neq B\cdot A$, which means that the multiplication over $M_2(\mathbb{R})$ is not commutative.
3. The neutral element with respect to the multiplication over $M_2(\mathbb{R})$ is the identity matrix \begin{equation*}I_2=\begin{pmatrix}1 & 0 \\ 0 & 1\end{pmatrix}\end{equation*} Indeed, plugging $a'=d'=1$ and $b'=c'=0$ into the definition of the product gives $A\cdot I_2=A$ for every $A$, and symmetrically $I_2\cdot A=A$.
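As a quick sanity check of neutrality, a symbolic sketch (sympy assumed, as above):

```python
# Check that I2 is neutral on both sides for a generic 2x2 matrix.
import sympy as sp

a, b, c, d = sp.symbols('a b c d')
A = sp.Matrix([[a, b], [c, d]])
I2 = sp.eye(2)  # the identity matrix

print(A * I2 == A)  # True
print(I2 * A == A)  # True
```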
Is what I have done correct and complete? (Wondering)

Homework Helper
MHB
Hey mathmari!

What is the definition of addition in $\mathbb R^n$? (Wondering)
It should be defined in terms of addition on $\mathbb R$.
And of scalar multiplication?

The rest looks all good to me. (Smile)

Gold Member
MHB
> What is the definition of addition in $\mathbb R^n$? (Wondering)
> It should be defined in terms of addition on $\mathbb R$.
> And of scalar multiplication?

We define addition in $\mathbb{R}^n$ componentwise: $x+y=(x_1, \ldots , x_n)+(y_1, \ldots , y_n)=(x_1+y_1, \ldots , x_n+y_n)$.
We have that $0_{\mathbb{R}^n}=(0,\ldots , 0)$.

So we have that \begin{equation*} v+0_{\mathbb{R}^n}=(v_1, \ldots , v_n)+(0,\ldots , 0)=(v_1+0, \ldots , v_n+0)=(v_1, \ldots , v_n)=v\end{equation*}
Similarly, we get \begin{equation*}0_{\mathbb{R}^n}+v=(0,\ldots , 0)+(v_1, \ldots , v_n)=(0+v_1, \ldots , 0+v_n)=(v_1, \ldots , v_n)=v\end{equation*}

Therefore, it holds that $v+0_{\mathbb{R}^n}=v=0_{\mathbb{R}^n}+v$.
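The same componentwise reasoning can be checked symbolically (sympy assumed, with a generic $v\in\mathbb{R}^3$ for concreteness):

```python
# Check v + 0 = v = 0 + v componentwise for a generic v in R^3.
import sympy as sp

v = sp.Matrix(sp.symbols('v1:4'))  # v = (v1, v2, v3)
zero = sp.zeros(3, 1)              # 0_{R^3}

print(v + zero == v)  # True
print(zero + v == v)  # True
```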

Scalar multiplication is defined componentwise as well:
$\lambda x=\lambda (x_1, \ldots , x_n)=(\lambda x_1, \ldots , \lambda x_n)$.

Then we get \begin{align*}\lambda (v+w)&=\lambda \left ((v_1, \ldots , v_n)+(w_1, \ldots , w_n)\right )=\lambda \left ((v_1+w_1, \ldots , v_n+w_n)\right ) \\ & =(\lambda(v_1+w_1), \ldots , \lambda(v_n+w_n))=(\lambda v_1+\lambda w_1, \ldots , \lambda v_n+\lambda w_n) \\ & =(\lambda v_1, \ldots , \lambda v_n)+(\lambda w_1, \ldots , \lambda w_n) =\lambda( v_1, \ldots , v_n)+ \lambda (w_1, \ldots , w_n) \\ & =\lambda v+\lambda w\end{align*}
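The componentwise distributivity argument can be verified the same way (sympy assumed, $n=3$ for concreteness):

```python
# Check λ(v + w) = λv + λw componentwise for generic v, w in R^3.
import sympy as sp

lam = sp.Symbol('lambda')
v = sp.Matrix(sp.symbols('v1:4'))
w = sp.Matrix(sp.symbols('w1:4'))

lhs = (lam * (v + w)).expand()  # λ(v + w), each entry expanded
rhs = lam * v + lam * w         # λv + λw
print(lhs == rhs)  # True
```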

Are these proofs correct and complete? (Wondering)

Homework Helper
MHB
Yep. All correct. (Nod)

Gold Member
MHB
> Yep. All correct. (Nod)

Great! Thank you! (Sun)

Gold Member
MHB
> We see that $A\cdot B\neq B\cdot A$
For which values of $a,\ldots,h$? For all? For some? Then for which ones exactly? The expressions $a^{3}+b^{3}+c^{3}-3abc$ and $(a+b+c)(a^{2} +b^{2}+c^{2}-ab-bc-ca)$ also look quite different, yet they are equal.
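For instance, one explicit pair already settles the question; a minimal check (my example, not from the thread; sympy assumed, and easy to verify by hand):

```python
# A concrete pair of 2x2 matrices that do not commute.
import sympy as sp

A = sp.Matrix([[0, 1], [0, 0]])
B = sp.Matrix([[0, 0], [1, 0]])

print(A * B)           # Matrix([[1, 0], [0, 0]])
print(B * A)           # Matrix([[0, 0], [0, 1]])
print(A * B == B * A)  # False
```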