Prove the statements: Vectors/Matrices

  • Context: MHB 
  • Thread starter: mathmari

Discussion Overview

The discussion revolves around proving various properties of vectors and matrices, specifically in the context of vector addition, scalar multiplication, and matrix multiplication. Participants explore the definitions and properties of these operations in both theoretical and practical frameworks.

Discussion Character

  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • One participant presents proofs for the properties of vector addition and scalar multiplication in $\mathbb{R}^n$, asserting that $v + 0_{\mathbb{R}^n} = v = 0_{\mathbb{R}^n} + v$ and $\lambda(v + w) = \lambda v + \lambda w$.
  • Another participant questions the definitions of addition and scalar multiplication in $\mathbb{R}^n$, suggesting they should be defined in terms of operations on $\mathbb{R}$.
  • A participant provides a detailed explanation of vector addition and scalar multiplication, confirming the earlier claims about their properties.
  • Multiple participants agree that the proofs presented regarding vector operations are correct and complete.
  • Discussion on matrix multiplication reveals that $A \cdot B \neq B \cdot A$, but questions arise about the specific values of the matrix elements that lead to this non-commutativity.
  • One participant references algebraic expressions to illustrate that different forms can yield the same result, prompting further inquiry into the conditions under which matrix multiplication is non-commutative.

Areas of Agreement / Disagreement

There is general agreement on the correctness of the proofs related to vector addition and scalar multiplication. However, the discussion on matrix multiplication remains unresolved, with participants questioning the specific conditions under which non-commutativity occurs.

Contextual Notes

The discussion includes assumptions about the definitions of vector operations and matrix multiplication that are not explicitly stated. The exploration of matrix multiplication's non-commutativity lacks specific examples or conditions that clarify the inquiry.

mathmari
Hey! :o

Let $1\leq n\in \mathbb{N}$.

  • Prove that for all $v\in \mathbb{R}^n$ it holds that $v+0_{\mathbb{R}^n}=v=0_{\mathbb{R}^n}+v$.
  • Prove that for all $\lambda\in \mathbb{R}$ and $v,w\in \mathbb{R}^n$ it holds that $\lambda (v+w)=\lambda v+\lambda w$.
  • Let $M_2(\mathbb{R}):=\left \{\begin{pmatrix}a & b \\ c & d\end{pmatrix}\mid a, b, c, d\in \mathbb{R}\right \}$ be the set of all $2\times 2$-matrices over $\mathbb{R}$. We also define a multiplication on this set by \begin{equation*}\begin{pmatrix}a & b \\ c & d\end{pmatrix}\cdot \begin{pmatrix}a' & b' \\ c' & d'\end{pmatrix}=\begin{pmatrix}aa'+bc' & ab'+bd' \\ ca'+dc' & cb'+dd'\end{pmatrix}\end{equation*}
    1. Show that the multiplication over $M_2(\mathbb{R})$ is associative.
    2. Is the multiplication over $M_2(\mathbb{R})$ commutative?
    3. Is there a neutral element with respect to the multiplication over $M_2(\mathbb{R})$?
I have done the following:

  • How can we prove this property? (Wondering)
  • Could you also give me a hint for this one? (Wondering)
    1. Let $A=\begin{pmatrix}a & b \\ c & d\end{pmatrix}, \ B=\begin{pmatrix}e & f \\ g & h\end{pmatrix}, \ C=\begin{pmatrix}i & j \\ k & \ell\end{pmatrix}$.

      Then we have the following: \begin{align*}(A\cdot B)\cdot C&=\left (\begin{pmatrix}a & b \\ c & d\end{pmatrix}\cdot \begin{pmatrix}e & f \\ g & h\end{pmatrix}\right )\cdot \begin{pmatrix}i & j \\ k & \ell\end{pmatrix}= \begin{pmatrix}ae+bg & af+bh \\ ce+dg & cf+dh\end{pmatrix}\cdot \begin{pmatrix}i & j \\ k & \ell\end{pmatrix}\\ & = \begin{pmatrix}(ae+bg)i+(af+bh)k & (ae+bg)j+(af+bh)\ell \\ (ce+dg)i+(cf+dh)k & (ce+dg)j+(cf+dh)\ell\end{pmatrix}\\ & = \begin{pmatrix}aei+bgi+afk+bhk & aej+bgj+af\ell+bh\ell \\ cei+dgi+cfk+dhk & cej+dgj+cf\ell+dh\ell\end{pmatrix}\end{align*}

      \begin{align*}A\cdot (B\cdot C)&=\begin{pmatrix}a & b \\ c & d\end{pmatrix}\cdot\left ( \begin{pmatrix}e & f \\ g & h\end{pmatrix}\cdot \begin{pmatrix}i & j \\ k & \ell\end{pmatrix}\right )= \begin{pmatrix}a & b \\ c & d\end{pmatrix}\cdot\begin{pmatrix}ei+fk & ej+f\ell \\ gi+hk & gj+h\ell\end{pmatrix} \\ & = \begin{pmatrix}a(ei+fk)+b(gi+hk) & a(ej+f\ell)+b(gj+h\ell) \\ c(ei+fk)+d(gi+hk) & c(ej+f\ell)+d(gj+h\ell)\end{pmatrix} \\ & = \begin{pmatrix}aei+afk+bgi+bhk & aej+af\ell+bgj+bh\ell \\ cei+cfk+dgi+dhk & cej+cf\ell+dgj+dh\ell\end{pmatrix}\\ & = \begin{pmatrix}aei+bgi+afk+bhk & aej+bgj+af\ell+bh\ell \\ cei+dgi+cfk+dhk & cej+dgj+cf\ell+dh\ell\end{pmatrix}\end{align*}

      The results are the same. Therefore it holds that $(A\cdot B)\cdot C=A\cdot (B\cdot C)$ which means the multiplication over $M_2(\mathbb{R})$ is associative.
    2. Let $A=\begin{pmatrix}a & b \\ c & d\end{pmatrix}, \ B=\begin{pmatrix}e & f \\ g & h\end{pmatrix}$.

      Then we have the following: \begin{equation*}A\cdot B=\begin{pmatrix}a & b \\ c & d\end{pmatrix}\cdot \begin{pmatrix}e & f \\ g & h\end{pmatrix}=\begin{pmatrix}ae+bg & af+bh \\ ce+dg & cf+dh\end{pmatrix} \end{equation*}

      On the other hand, we have: \begin{equation*}B\cdot A=\begin{pmatrix}e & f \\ g & h\end{pmatrix}\cdot \begin{pmatrix}a & b \\ c & d\end{pmatrix}=\begin{pmatrix}ea+fc & eb+fd \\ ga+hc & gb+hd\end{pmatrix} \end{equation*}

      We see that $A\cdot B\neq B\cdot A$, which means that the multiplication over $M_2(\mathbb{R})$ is not commutative.
    3. The neutral element with respect to the multiplication over $M_2(\mathbb{R})$ is the identity matrix \begin{equation*}I_2=\begin{pmatrix}1 & 0 \\ 0 & 1\end{pmatrix}\end{equation*} since for every $A=\begin{pmatrix}a & b \\ c & d\end{pmatrix}$ the multiplication rule gives $A\cdot I_2=A=I_2\cdot A$.
    Is what I have done correct and complete? (Wondering)
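As a side note, here is a small Python/SymPy sketch (my own, built around a hypothetical helper `mul2` that implements the multiplication rule from the exercise literally). It re-checks the associativity computation symbolically, exhibits one concrete pair of matrices with $A\cdot B\neq B\cdot A$, and confirms that $I_2$ is neutral:

```python
# Sketch (not from the thread): checks on the 2x2 multiplication rule of the exercise.
import sympy as sp

def mul2(X, Y):
    """The multiplication rule from the exercise, for 2x2 matrices
    stored as nested tuples ((a, b), (c, d))."""
    (a, b), (c, d) = X
    (a2, b2), (c2, d2) = Y
    return ((a*a2 + b*c2, a*b2 + b*d2),
            (c*a2 + d*c2, c*b2 + d*d2))

def same(X, Y):
    """True iff corresponding entries agree as polynomials."""
    return all(sp.expand(x - y) == 0
               for rx, ry in zip(X, Y) for x, y in zip(rx, ry))

a, b, c, d, e, f, g, h, p, q, r, s = sp.symbols('a b c d e f g h p q r s')
A, B, C = ((a, b), (c, d)), ((e, f), (g, h)), ((p, q), (r, s))

# 1. Associativity holds for fully symbolic entries.
assert same(mul2(mul2(A, B), C), mul2(A, mul2(B, C)))

# 2. Commutativity fails in general: one concrete counterexample.
A0, B0 = ((1, 1), (0, 1)), ((1, 0), (1, 1))
assert mul2(A0, B0) == ((2, 1), (1, 1))
assert mul2(B0, A0) == ((1, 1), (1, 2))
assert mul2(A0, B0) != mul2(B0, A0)

# 3. The identity matrix is neutral, even with symbolic entries in A.
I2 = ((1, 0), (0, 1))
assert mul2(A, I2) == A == mul2(I2, A)
```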
 
Hey mathmari!

What is the definition of addition in $\mathbb R^n$? (Wondering)
It should be defined in terms of addition on $\mathbb R$.
And of scalar multiplication?

The rest looks all good to me. (Smile)
 
Klaas van Aarsen said:
What is the definition of addition in $\mathbb R^n$? (Wondering)
It should be defined in terms of addition on $\mathbb R$.
And of scalar multiplication?

We define addition in $\mathbb{R}^n$ componentwise: $x+y=(x_1, \ldots , x_n)+(y_1, \ldots , y_n)=(x_1+y_1, \ldots , x_n+y_n)$.
We have that $0_{\mathbb{R}^n}=(0,\ldots , 0)$.

So we have that \begin{equation*} v+0_{\mathbb{R}^n}=(v_1, \ldots , v_n)+(0,\ldots , 0)=(v_1+0, \ldots , v_n+0)=(v_1, \ldots , v_n)=v\end{equation*}
Similarly, we get \begin{equation*}0_{\mathbb{R}^n}+v=(0,\ldots , 0)+(v_1, \ldots , v_n)=(0+v_1, \ldots , 0+v_n)=(v_1, \ldots , v_n)=v\end{equation*}

Therefore, it holds that $v+0_{\mathbb{R}^n}=v=0_{\mathbb{R}^n}+v$.
The scalar multiplication is defined as follows:
$\lambda x=\lambda (x_1, \ldots , x_n)=(\lambda x_1, \ldots , \lambda x_n)$.

Then we get \begin{align*}\lambda (v+w)&=\lambda \left ((v_1, \ldots , v_n)+(w_1, \ldots , w_n)\right )=\lambda \left ((v_1+w_1, \ldots , v_n+w_n)\right ) \\ & =(\lambda(v_1+w_1), \ldots , \lambda(v_n+w_n))=(\lambda v_1+\lambda w_1, \ldots , \lambda v_n+\lambda w_n) \\ & =(\lambda v_1, \ldots , \lambda v_n)+(\lambda w_1, \ldots , \lambda w_n) =\lambda( v_1, \ldots , v_n)+ \lambda (w_1, \ldots , w_n) \\ & =\lambda v+\lambda w\end{align*} Are these proofs correct and complete? (Wondering)
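As an aside, a tiny script (my own hypothetical helpers `vec_add` and `scalar_mul`, mirroring the componentwise definitions above) can spot-check both identities on random integer vectors:

```python
# Sketch (not from the thread): componentwise operations on R^n as tuples.
import random

def vec_add(x, y):
    """Componentwise addition: (x_1 + y_1, ..., x_n + y_n)."""
    return tuple(xi + yi for xi, yi in zip(x, y))

def scalar_mul(lam, x):
    """Scalar multiplication: (lam*x_1, ..., lam*x_n)."""
    return tuple(lam * xi for xi in x)

n = 5
zero = (0,) * n
# Integer components, so every equality below is exact.
v = tuple(random.randint(-9, 9) for _ in range(n))
w = tuple(random.randint(-9, 9) for _ in range(n))
lam = random.randint(-9, 9)

# v + 0 = v = 0 + v
assert vec_add(v, zero) == v == vec_add(zero, v)
# lambda*(v + w) = lambda*v + lambda*w
assert scalar_mul(lam, vec_add(v, w)) == vec_add(scalar_mul(lam, v), scalar_mul(lam, w))
```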
 
Yep. All correct. (Nod)
 
Klaas van Aarsen said:
Yep. All correct. (Nod)

Great! Thank you! (Sun)
 
mathmari said:
We see that $A\cdot B\neq B\cdot A$
For which values of $a,\ldots,h$? For all? For some? Then for which exactly? The expressions $a^{3}+b^{3}+c^{3}-3abc$ and $(a+b+c)(a^{2}+b^{2}+c^{2}-ab-bc-ca)$ also look quite different, yet they are equal.
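For reference, a short SymPy check (an added sketch, not part of the reply) confirms that the two quoted expressions expand to the same polynomial:

```python
# Sketch (not from the thread): the two expressions are identical as polynomials.
import sympy as sp

a, b, c = sp.symbols('a b c')
lhs = a**3 + b**3 + c**3 - 3*a*b*c
rhs = (a + b + c) * (a**2 + b**2 + c**2 - a*b - b*c - c*a)
assert sp.expand(lhs - rhs) == 0
```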
 
