# Diagonalizable transformation - Existence of basis

• MHB
• mathmari
Yes, since we are in $\mathbb R^4$. 🤔

Exactly. So let's call this basis $B$. (Nod)
Let's also call the first two basis vectors $B_{\text{first}}$ and the last two basis vectors $B_{\text{last}}$. (Nod)
If we knew that $\sigma_v\circ\sigma_w$ sends the first two basis vectors to themselves and the last two basis vectors to themselves, that would be great! 🧐
So, does it? 🤔
Klaas van Aarsen said:
What is the matrix of $\sigma_v$ with respect to the basis $(v,w,b_3,b_4)$?
What is the matrix of $\sigma_w$ with respect to the basis $(v,w,b_3,b_4)$?
What is the product of those matrices?
Can we find a more convenient basis so that we get $D$?

It holds that $\sigma_v(v)=-v$ and $\sigma_v(w)=\sigma_v(b_3)=\sigma_v(b_4)=0$, or not?
So we get the matrix $\begin{pmatrix}-1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0\end{pmatrix}$.
Respectively, $\sigma_w(w)=-w$ and $\sigma_w(v)=\sigma_w(b_3)=\sigma_w(b_4)=0$, right?
So we get the matrix $\begin{pmatrix}0 & 0 & 0 & 0 \\ 0 & -1 & 0 & 0 \\ 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0\end{pmatrix}$.
The product of those matrices is the zero matrix, or not?
:unsure:

mathmari said:
It holds that $\sigma_v(v)=-v$ and $\sigma_v(w)=\sigma_v(b_3)=\sigma_v(b_4)=0$, or not?
Nope. (Shake)
Suppose we fill in $b_3$ in the formula of $\sigma_v$ and use that $b_3\cdot v=0$, what do we get?

Klaas van Aarsen said:
Nope. (Shake)
Suppose we fill in $b_3$ in the formula of $\sigma_v$ and use that $b_3\cdot v=0$, what do we get?

It holds that $\sigma_v(v)=-v$ and $\sigma_v(w)=w, \sigma_v(b_3)=b_3, \sigma_v(b_4)=b_4$, or not?
So we get the matrix $\begin{pmatrix}-1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 1\end{pmatrix}$.
Respectively, $\sigma_w(w)=-w$ and $\sigma_w(v)=v, \sigma_w(b_3)=b_3, \sigma_w(b_4)=b_4$, right?
So we get the matrix $\begin{pmatrix}1 & 0 & 0 & 0 \\ 0 & -1 & 0 & 0 \\ 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 1\end{pmatrix}$.
The product of those matrices is $\begin{pmatrix}-1 & 0 & 0 & 0 \\ 0 & -1 & 0 & 0 \\ 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 1\end{pmatrix}$, or not?
:unsure:
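This special case can be checked numerically. Below is a minimal plain-Python sketch, under the assumption that $v$ and $w$ are orthogonal unit vectors (the concrete choice $v=e_1$, $w=e_2$ is purely for illustration): the columns of $\sigma_v\circ\sigma_w$ with respect to $(v,w,b_3,b_4)$ come out as the diagonal matrix above.

```python
# Sketch: for orthogonal unit vectors v and w, the matrix of
# sigma_v o sigma_w with respect to (v, w, b3, b4) is diag(-1, -1, 1, 1),
# i.e. a rotation by pi in the v-w plane.

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def reflect(x, v):
    """sigma_v(x) = x - 2 (x . v) v, for a unit vector v."""
    s = 2 * dot(x, v)
    return [a - s * b for a, b in zip(x, v)]

v = [1.0, 0.0, 0.0, 0.0]   # choosing v = e1 and w = e2: the orthogonal case
w = [0.0, 1.0, 0.0, 0.0]
basis = [v, w, [0.0, 0.0, 1.0, 0.0], [0.0, 0.0, 0.0, 1.0]]

# Columns of the matrix of sigma_v o sigma_w in this basis:
columns = [reflect(reflect(b, w), v) for b in basis]
print(columns[0])  # -> [-1.0, 0.0, 0.0, 0.0]
print(columns[1])  # -> [0.0, -1.0, 0.0, 0.0]
```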

mathmari said:
It holds that $\sigma_v(v)=-v$ and $\sigma_v(w)=w, \sigma_v(b_3)=b_3, \sigma_v(b_4)=b_4$, or not?
Better. But we do not generally have $\sigma_v(w)=w$. That is only the case if $w$ happens to be orthogonal to $v$.
If that is the case, then we did find $D$ for this special case, in which we have $\alpha=\pi$.

Klaas van Aarsen said:
Better. But we do not generally have $\sigma_v(w)=w$. That is only the case if $w$ happens to be orthogonal to $v$.

Yes, in this case we suppose that $w, b_3, b_4$ are orthogonal to $v$, right? :unsure:

So is this the resulting matrix, and do we then find an $\alpha$ that gives this one? :unsure:

Klaas van Aarsen said:
If that is the case, then we did find $D$ for this special case, in which we have $\alpha=\pi$.

So do we have to do the same in the case that $v$ and $w$ are not orthogonal? :unsure:

We can always pick $b_3$ and $b_4$, such that they are orthogonal to both $v$ and $w$.
However, we do not get to pick $w$. The vectors $v$ and $w$ are given as part of the problem. We do not know anything about their relationship.
We can only distinguish the cases that they are either independent or not. (Sweating)
And of course we can take a look at the special case that $v$ and $w$ are orthogonal and see that it works out.

Klaas van Aarsen said:
We can always pick $b_3$ and $b_4$, such that they are orthogonal to both $v$ and $w$.
However, we do not get to pick $w$. The vectors $v$ and $w$ are given as part of the problem. We do not know anything about their relationship.
We can only distinguish the cases that they are either independent or not. (Sweating)

If $v$ and $w$ are not independent.
It holds that $\sigma_v(v)=-v$ and $\sigma_v(w)=w-(w\cdot v)w=(1-w\cdot v)w$, $\sigma_v(b_3)=b_3, \sigma_v(b_4)=b_4$, or not?
So we get the matrix $\begin{pmatrix}-1 & 0 & 0 & 0 \\ 0 & (1-w\cdot v) & 0 & 0 \\ 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 1\end{pmatrix}$.
Respectively, $\sigma_w(w)=-w$ and $\sigma_w(v)=v-(v\cdot w)v=(1-v\cdot w)v, \sigma_w(b_3)=b_3, \sigma_w(b_4)=b_4$, right?
So we get the matrix $\begin{pmatrix}(1-v\cdot w) & 0 & 0 & 0 \\ 0 & -1 & 0 & 0 \\ 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 1\end{pmatrix}$.

Is that correct? :unsure:

mathmari said:
If $v$ and $w$ are not independent.
Erm... if $v$ and $w$ are not independent, then they are dependent, and $(v,w,b_3,b_4)$ is not a basis.
Then we cannot write the matrix of $\sigma_v$ with respect to $(v,w,b_3,b_4)$. (Tauri)

And we have $\sigma_v(w)=w-2(w\cdot v)v$, don't we? :unsure:
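As a quick sanity check of that formula, here is a hedged plain-Python sketch (the particular unit vectors are arbitrary choices): $\sigma_v(x)=x-2(x\cdot v)v$ sends $v$ to $-v$, fixes anything orthogonal to $v$, and applied twice gives the identity.

```python
# Sanity checks for the reflection formula sigma_v(x) = x - 2 (x . v) v,
# with v a unit vector: sigma_v(v) = -v, sigma_v fixes the hyperplane
# orthogonal to v, and sigma_v is an involution.
import math

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def reflect(x, v):
    s = 2 * dot(x, v)
    return [a - s * b for a, b in zip(x, v)]

v = [1 / math.sqrt(2), 1 / math.sqrt(2), 0.0, 0.0]  # an arbitrary unit vector
b = [0.0, 0.0, 1.0, 0.0]                            # orthogonal to v
w = [0.6, 0.0, 0.8, 0.0]                            # a generic unit vector

assert all(abs(a + c) < 1e-12 for a, c in zip(reflect(v, v), v))  # sigma_v(v) = -v
assert reflect(b, v) == b                                         # fixes v-perp
twice = reflect(reflect(w, v), v)
assert all(abs(a - c) < 1e-12 for a, c in zip(twice, w))          # involution
```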

Klaas van Aarsen said:
Erm... if $v$ and $w$ are not independent, then they are dependent, and $(v,w,b_3,b_4)$ is not a basis.
Then we cannot write the matrix of $\sigma_v$ with respect to $(v,w,b_3,b_4)$. (Tauri)

So $v$ and $w$ must be independent? Or do we do something else in that case? :unsure:

mathmari said:
So $v$ and $w$ must be independent? Or do we do something else in that case?
Not necessarily. It's just a different case. If $v$ and $w$ are dependent, we can pick the basis $(v,b_2,b_3,b_4)$ if we want, with each $b_i$ perpendicular to both $v$ and $w$.
It's another special case that corresponds to $\alpha=0$.

Klaas van Aarsen said:
Not necessarily. It's just a different case. If $v$ and $w$ are dependent, we can pick the basis $(v,b_2,b_3,b_4)$ if we want, with each $b_i$ perpendicular to both $v$ and $w$.
It's another special case that corresponds to $\alpha=0$.

So in general we have these two cases right, one $\alpha=\pi$ and one $\alpha=0$ ? :unsure:

mathmari said:
So in general we have these two cases right, one $\alpha=\pi$ and one $\alpha=0$ ?
The more 'general' case is when $0<\alpha<\pi$. Of course we also need to ensure that the edge cases are covered.

Klaas van Aarsen said:
The more 'general' case is when $0<\alpha<\pi$. Of course we also need to ensure that the edge cases are covered.

But how do we get that more general case, so that the matrix depends on $\alpha$ ? :unsure:

Let's see what $\sigma_v$ looks like with respect to the basis $(v,w,b_3,b_4)$ assuming that $v$ and $w$ are independent.

We have $\sigma_v(w)=w-2(w\cdot v)v$.
So:
$$D_{\sigma_v} = \begin{pmatrix}-1&-2w\cdot v\\&1\\&&1\\&&&1\end{pmatrix}$$

Similarly we have $\sigma_w(v)=v-2(v\cdot w)w$. So:
$$D_{\sigma_w} = \begin{pmatrix}1\\-2v\cdot w&-1\\&&1\\&&&1\end{pmatrix}$$
And:
$$D_{\sigma_v} D_{\sigma_w}= \begin{pmatrix}-1+4(v\cdot w)^2&2v\cdot w\\-2v\cdot w&-1\\&&1\\&&&1\end{pmatrix}$$

Close, isn't it? But not quite there. We do see that we have a top left 2x2 block and otherwise the identity matrix.
And we already know that two reflections are supposed to make a rotation in 2 dimensions.
But that is with respect to an orthonormal basis.
So suppose we construct an orthonormal basis.
We can start with $v$.
We can pick a second vector $b_2$ perpendicular to $v$ with unit length such that $w$ is a linear combination of $v$ and $b_2$.
If $v$ and $w$ are independent, then we can construct it with $\tilde b_2 =w-(w\cdot v)v$ and then pick $b_2=\tilde b_2/\|\tilde b_2\|$.
What can we do if $v$ and $w$ are dependent? Finally we pick $b_3$ and $b_4$ so that $(v,b_2,b_3,b_4)$ is an orthonormal basis.

The resulting matrix $D_{\sigma_v} D_{\sigma_w}$ will again have a top left 2x2 block and will otherwise be the identity matrix, won't it?

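The construction just described can be sketched numerically. Assuming concrete unit vectors at an angle $\alpha$ (the choice $v=e_1$, $w=\cos\alpha\, e_1+\sin\alpha\, e_2$ is purely illustrative), the Gram-Schmidt step yields $b_2$, and the matrix of $\sigma_v\circ\sigma_w$ with respect to $(v,b_2,b_3,b_4)$ indeed has a rotation by $2\alpha$ in the top left block:

```python
# Sketch: build the orthonormal basis (v, b2, b3, b4) as in the text and
# compute the columns of sigma_v o sigma_w with respect to it.
import math

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def reflect(x, v):
    """sigma_v(x) = x - 2 (x . v) v, for a unit vector v."""
    return [a - 2 * dot(x, v) * b for a, b in zip(x, v)]

alpha = 0.7                                   # any angle with 0 < alpha < pi
v = [1.0, 0.0, 0.0, 0.0]
w = [math.cos(alpha), math.sin(alpha), 0.0, 0.0]

# The Gram-Schmidt step from the text: b2 = (w - (w.v) v), normalised.
t = [a - dot(w, v) * c for a, c in zip(w, v)]
norm = math.sqrt(dot(t, t))
b2 = [a / norm for a in t]
basis = [v, b2, [0.0, 0.0, 1.0, 0.0], [0.0, 0.0, 0.0, 1.0]]

# Columns of sigma_v o sigma_w with respect to this orthonormal basis.
cols = [reflect(reflect(b, w), v) for b in basis]

# The top left 2x2 block should be a rotation by 2*alpha;
# the remaining columns should be untouched.
c2, s2 = math.cos(2 * alpha), math.sin(2 * alpha)
print(cols[0][:2])   # close to [cos(2a), -sin(2a)]
print(cols[1][:2])   # close to [sin(2a),  cos(2a)]
print(cols[2])       # -> [0.0, 0.0, 1.0, 0.0]
```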
So in general, is $\alpha$ the angle between $v$ and $w$ ? :unsure:

mathmari said:
So in general, is $\alpha$ the angle between $v$ and $w$ ?
$\alpha$ is $2$ times the angle between $v$ and $w$.
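To illustrate that closing remark numerically, here is a sketch restricted to the $v$-$w$ plane, with an arbitrarily chosen angle $\theta$ between $v$ and $w$: reading the rotation angle off the $2\times 2$ block of $\sigma_v\circ\sigma_w$ gives $2\theta$.

```python
# Sketch: the rotation angle of sigma_v o sigma_w, extracted from its
# matrix on the v-w plane, equals twice the angle between v and w.
import math

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def reflect(x, v):
    return [a - 2 * dot(x, v) * b for a, b in zip(x, v)]

theta = 0.4                          # the angle between v and w
v = [1.0, 0.0]
w = [math.cos(theta), math.sin(theta)]

# First column of sigma_v o sigma_w in the basis (e1, e2):
c0 = reflect(reflect([1.0, 0.0], w), v)
angle = math.atan2(-c0[1], c0[0])    # rotation angle, up to orientation
print(abs(angle), 2 * theta)         # the two values agree
```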