MHB Prove the statements: Vectors/Matrices

  • Thread starter: mathmari
Hey! :o

Let $1\leq n\in \mathbb{N}$.

  • Prove that for all $v\in \mathbb{R}^n$ it holds that $v+0_{\mathbb{R}^n}=v=0_{\mathbb{R}^n}+v$.
  • Prove that for all $\lambda\in \mathbb{R}$ and $v,w\in \mathbb{R}^n$ it holds that $\lambda (v+w)=\lambda v+\lambda w$.
  • Let $M_2(\mathbb{R}):=\left \{\begin{pmatrix}a & b \\ c & d\end{pmatrix}\mid a, b, c, d\in \mathbb{R}\right \}$ be the set of all $2\times 2$ matrices over $\mathbb{R}$. We also define a multiplication on this set by \begin{equation*}\begin{pmatrix}a & b \\ c & d\end{pmatrix}\cdot \begin{pmatrix}a' & b' \\ c' & d'\end{pmatrix}=\begin{pmatrix}aa'+bc' & ab'+bd' \\ ca'+dc' & cb'+dd'\end{pmatrix}\end{equation*}
    1. Show that the multiplication over $M_2(\mathbb{R})$ is associative.
    2. Is the multiplication over $M_2(\mathbb{R})$ commutative?
    3. Is there a neutral element with respect to the multiplication over $M_2(\mathbb{R})$?
I have done the following:

  • How can we prove this property? (Wondering)
  • Could you also give me a hint for this one? (Wondering)
    1. Let $A=\begin{pmatrix}a & b \\ c & d\end{pmatrix}, \ B=\begin{pmatrix}e & f \\ g & h\end{pmatrix}, \ C=\begin{pmatrix}i & j \\ k & \ell\end{pmatrix}$.

      Then we have the following: \begin{align*}(A\cdot B)\cdot C&=\left (\begin{pmatrix}a & b \\ c & d\end{pmatrix}\cdot \begin{pmatrix}e & f \\ g & h\end{pmatrix}\right )\cdot \begin{pmatrix}i & j \\ k & \ell\end{pmatrix}= \begin{pmatrix}ae+bg & af+bh \\ ce+dg & cf+dh\end{pmatrix}\cdot \begin{pmatrix}i & j \\ k & \ell\end{pmatrix}\\ & = \begin{pmatrix}(ae+bg)i+(af+bh)k & (ae+bg)j+(af+bh)\ell \\ (ce+dg)i+(cf+dh)k & (ce+dg)j+(cf+dh)\ell\end{pmatrix}\\ & = \begin{pmatrix}aei+bgi+afk+bhk & aej+bgj+af\ell+bh\ell \\ cei+dgi+cfk+dhk & cej+dgj+cf\ell+dh\ell\end{pmatrix}\end{align*}

      \begin{align*}A\cdot (B\cdot C)&=\begin{pmatrix}a & b \\ c & d\end{pmatrix}\cdot\left ( \begin{pmatrix}e & f \\ g & h\end{pmatrix}\cdot \begin{pmatrix}i & j \\ k & \ell\end{pmatrix}\right )= \begin{pmatrix}a & b \\ c & d\end{pmatrix}\cdot\begin{pmatrix}ei+fk & ej+f\ell \\ gi+hk & gj+h\ell\end{pmatrix} \\ & = \begin{pmatrix}a(ei+fk)+b(gi+hk) & a(ej+f\ell)+b(gj+h\ell) \\ c(ei+fk)+d(gi+hk) & c(ej+f\ell)+d(gj+h\ell)\end{pmatrix} \\ & = \begin{pmatrix}aei+afk+bgi+bhk & aej+af\ell+bgj+bh\ell \\ cei+cfk+dgi+dhk & cej+cf\ell+dgj+dh\ell\end{pmatrix}\\ & = \begin{pmatrix}aei+bgi+afk+bhk & aej+bgj+af\ell+bh\ell \\ cei+dgi+cfk+dhk & cej+dgj+cf\ell+dh\ell\end{pmatrix}\end{align*}

      The results are the same. Therefore it holds that $(A\cdot B)\cdot C=A\cdot (B\cdot C)$, which means that the multiplication over $M_2(\mathbb{R})$ is associative. (A symbolic sanity check of this computation is sketched after this post.)
    2. Let $A=\begin{pmatrix}a & b \\ c & d\end{pmatrix}, \ B=\begin{pmatrix}e & f \\ g & h\end{pmatrix}$.

      Then we have the following: \begin{equation*}A\cdot B=\begin{pmatrix}a & b \\ c & d\end{pmatrix}\cdot \begin{pmatrix}e & f \\ g & h\end{pmatrix}=\begin{pmatrix}ae+bg & af+bh \\ ce+dg & cf+dh\end{pmatrix} \end{equation*}

      On the other hand, we have: \begin{equation*}B\cdot A=\begin{pmatrix}e & f \\ g & h\end{pmatrix}\cdot \begin{pmatrix}a & b \\ c & d\end{pmatrix}=\begin{pmatrix}ea+fc & eb+fd \\ ga+hc & gb+hd\end{pmatrix} \end{equation*}

      We see that $A\cdot B\neq B\cdot A$, which means that the multiplication over $M_2(\mathbb{R})$ is not commutative.
    3. The neutral element with respect to the multiplication over $M_2(\mathbb{R})$ is the identity matrix \begin{equation*}I_2=\begin{pmatrix}1 & 0 \\ 0 & 1\end{pmatrix}\end{equation*} Indeed, for every $A=\begin{pmatrix}a & b \\ c & d\end{pmatrix}$ the multiplication rule gives \begin{equation*}I_2\cdot A=\begin{pmatrix}1\cdot a+0\cdot c & 1\cdot b+0\cdot d \\ 0\cdot a+1\cdot c & 0\cdot b+1\cdot d\end{pmatrix}=A \quad \text{and} \quad A\cdot I_2=\begin{pmatrix}a\cdot 1+b\cdot 0 & a\cdot 0+b\cdot 1 \\ c\cdot 1+d\cdot 0 & c\cdot 0+d\cdot 1\end{pmatrix}=A.\end{equation*}
    Is what I have done correct and complete? (Wondering)
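
As an independent sanity check of the associativity computation in item 1 above (illustrative only; it does not replace the hand calculation), the two triple products can be expanded symbolically, for example with SymPy:

```python
# Symbolic check that (A*B)*C == A*(B*C) for generic 2x2 matrices.
import sympy as sp

# Twelve independent symbols for the entries of the three matrices.
a, b, c, d, e, f, g, h, i, j, k, l = sp.symbols('a b c d e f g h i j k l')

A = sp.Matrix([[a, b], [c, d]])
B = sp.Matrix([[e, f], [g, h]])
C = sp.Matrix([[i, j], [k, l]])

# The difference should expand to the zero matrix.
diff = ((A * B) * C - A * (B * C)).expand()
print(diff)  # Matrix([[0, 0], [0, 0]])
```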
 
Hey mathmari!

What is the definition of addition in $\mathbb R^n$? (Wondering)
It should be defined in terms of addition on $\mathbb R$.
And of scalar multiplication?

The rest looks all good to me. (Smile)
 
Klaas van Aarsen said:
What is the definition of addition in $\mathbb R^n$? (Wondering)
It should be defined in terms of addition on $\mathbb R$.
And of scalar multiplication?

We define addition in $\mathbb{R}^n$ componentwise: $x+y=(x_1, \ldots , x_n)+(y_1, \ldots , y_n)=(x_1+y_1, \ldots , x_n+y_n)$.
We have that $0_{\mathbb{R}^n}=(0,\ldots , 0)$.

So we have that \begin{equation*} v+0_{\mathbb{R}^n}=(v_1, \ldots , v_n)+(0,\ldots , 0)=(v_1+0, \ldots , v_n+0)=(v_1, \ldots , v_n)=v\end{equation*}
Similarly, we get \begin{equation*}0_{\mathbb{R}^n}+v=(0,\ldots , 0)+(v_1, \ldots , v_n)=(0+v_1, \ldots , 0+v_n)=(v_1, \ldots , v_n)=v\end{equation*}

Therefore, it holds that $v+0_{\mathbb{R}^n}=v=0_{\mathbb{R}^n}+v$.
The scalar multiplication is defined as follows:
$\lambda x=\lambda (x_1, \ldots , x_n)=(\lambda x_1, \ldots , \lambda x_n)$.

Then we get \begin{align*}\lambda (v+w)&=\lambda \left ((v_1, \ldots , v_n)+(w_1, \ldots , w_n)\right )=\lambda \left ((v_1+w_1, \ldots , v_n+w_n)\right ) \\ & =(\lambda(v_1+w_1), \ldots , \lambda(v_n+w_n))=(\lambda v_1+\lambda w_1, \ldots , \lambda v_n+\lambda w_n) \\ & =(\lambda v_1, \ldots , \lambda v_n)+(\lambda w_1, \ldots , \lambda w_n) =\lambda (v_1, \ldots , v_n)+ \lambda (w_1, \ldots , w_n) \\ & =\lambda v+\lambda w\end{align*}

Are these proofs correct and complete? (Wondering)
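
As an aside, here is a quick numerical spot check of both identities (a single example, so it illustrates the claims rather than proving them), sketched with NumPy:

```python
# Spot check of v + 0 = v = 0 + v and lambda*(v + w) = lambda*v + lambda*w.
import numpy as np

v = np.array([1.0, -2.0, 3.5, 0.0, 7.25])
w = np.array([0.5, 4.0, -1.5, 2.0, -3.25])
lam = 2.5
zero = np.zeros_like(v)  # the zero vector of R^n for n = 5

print(np.array_equal(v + zero, v))                    # True: v + 0 = v
print(np.array_equal(zero + v, v))                    # True: 0 + v = v
print(np.allclose(lam * (v + w), lam * v + lam * w))  # True, up to rounding
```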
 
Yep. All correct. (Nod)
 
Klaas van Aarsen said:
Yep. All correct. (Nod)

Great! Thank you! (Sun)
 
mathmari said:
We see that $A\cdot B\neq B\cdot A$
For which values of $a,\ldots,h$? For all? For some? Then for which exactly? The expressions $a^{3}+b^{3}+c^{3}-3abc$ and $(a+b+c)(a^{2}+b^{2}+c^{2}-ab-bc-ca)$ also look quite different, yet they are equal.
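
To make the argument complete, one concrete counterexample suffices (a standard choice; any non-commuting pair works): \begin{equation*}\begin{pmatrix}0 & 1 \\ 0 & 0\end{pmatrix}\cdot \begin{pmatrix}0 & 0 \\ 1 & 0\end{pmatrix}=\begin{pmatrix}1 & 0 \\ 0 & 0\end{pmatrix}\neq \begin{pmatrix}0 & 0 \\ 0 & 1\end{pmatrix}=\begin{pmatrix}0 & 0 \\ 1 & 0\end{pmatrix}\cdot \begin{pmatrix}0 & 1 \\ 0 & 0\end{pmatrix}\end{equation*}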
 