MHB Diagonalizable transformation - Existence of basis

  • #31
mathmari said:
So do we have to consider that $v$ and $w$ are independent, or do we have to check also the case that they are not independent?
That depends on how we set up the proof.
Perhaps we can start with the assumption that v and w are independent.
When the proof is complete, perhaps we won't have to make the distinction any more. 🤔
 
  • #32
Klaas van Aarsen said:
Let's assume for now that $v$ is independent from $w$.
And let $b_3$ and $b_4$ be vectors that are orthogonal to both $v$ and $w$.
Then we have that $\sigma_v(v)=-v$, so the first column of the matrix of ${\sigma_v}$ with respect to the basis $(v,w,b_3,b_4)$ is $\begin{pmatrix}-1\\0\\0\\0\end{pmatrix}$.
We also have that $\sigma_w(w)=-w$, so the second column of the matrix of ${\sigma_w}$ with respect to the basis $(v,w,b_3,b_4)$ is $\begin{pmatrix}0\\-1\\0\\0\end{pmatrix}$. 🤔

But here we don't have the rotation matrix, do we? :unsure:
 
  • #33
mathmari said:
But here we don't have the rotation matrix, do we?
Not yet. 🤔
 
  • #34
Klaas van Aarsen said:
Not yet. 🤔

I'm stuck right now. What do we do next? How do we get the rotation matrix? :unsure:
 
  • #35
mathmari said:
I'm stuck right now. What do we do next? How do we get the rotation matrix?
What is the matrix of $\sigma_v$ with respect to the basis $(v,w,b_3,b_4)$?
What is the matrix of $\sigma_w$ with respect to the basis $(v,w,b_3,b_4)$?
What is the product of those matrices?
Can we find a more convenient basis so that we get $D$? 🤔
 
  • #36
Klaas van Aarsen said:
What is the matrix of $\sigma_v$ with respect to the basis $(v,w,b_3,b_4)$?
What is the matrix of $\sigma_w$ with respect to the basis $(v,w,b_3,b_4)$?
What is the product of those matrices?
Can we find a more convenient basis so that we get $D$? 🤔

It holds that $\sigma_v(v)=-v$ and $\sigma_v(w)=\sigma_v(b_3)=\sigma_v(b_4)=0$, or not?
So we get the matrix $\begin{pmatrix}-1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0\end{pmatrix}$.
Respectively, $\sigma_w(w)=-w$ and $\sigma_w(v)=\sigma_w(b_3)=\sigma_w(b_4)=0$, right?
So we get the matrix $\begin{pmatrix}0 & 0 & 0 & 0 \\ 0 & -1 & 0 & 0 \\ 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0\end{pmatrix}$.
The product of those matrices is the zero matrix, or not?
:unsure:
 
  • #37
mathmari said:
It holds that $\sigma_v(v)=-v$ and $\sigma_v(w)=\sigma_v(b_3)=\sigma_v(b_4)=0$, or not?
Nope. (Shake)
Suppose we substitute $b_3$ into the formula for $\sigma_v$ and use that $b_3\cdot v=0$; what do we get? 🤔
 
  • #38
Klaas van Aarsen said:
Nope. (Shake)
Suppose we substitute $b_3$ into the formula for $\sigma_v$ and use that $b_3\cdot v=0$; what do we get? 🤔

It holds that $\sigma_v(v)=-v$ and $\sigma_v(w)=w, \sigma_v(b_3)=b_3, \sigma_v(b_4)=b_4$, or not?
So we get the matrix $\begin{pmatrix}-1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 1\end{pmatrix}$.
Respectively, $\sigma_w(w)=-w$ and $\sigma_w(v)=v, \sigma_w(b_3)=b_3, \sigma_w(b_4)=b_4$, right?
So we get the matrix $\begin{pmatrix}1 & 0 & 0 & 0 \\ 0 & -1 & 0 & 0 \\ 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 1\end{pmatrix}$.
The product of those matrices is $\begin{pmatrix}-1 & 0 & 0 & 0 \\ 0 & -1 & 0 & 0 \\ 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 1\end{pmatrix}$, or not?
:unsure:
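As a quick numerical sanity check of this special orthogonal case (an editor's sketch, not part of the original thread, assuming the reflection formula $\sigma_v(x)=x-2(x\cdot v)v$ for a unit vector $v$ that appears later in the thread):

```python
import numpy as np

def reflection(v):
    # Householder reflection sigma_v(x) = x - 2 (x . v) v for a unit vector v,
    # i.e. the matrix I - 2 v v^T.
    v = v / np.linalg.norm(v)
    return np.eye(len(v)) - 2.0 * np.outer(v, v)

# Special case: v and w are orthogonal unit vectors in R^4.
v = np.array([1.0, 0.0, 0.0, 0.0])
w = np.array([0.0, 1.0, 0.0, 0.0])

product = reflection(v) @ reflection(w)
print(product)  # diag(-1, -1, 1, 1): a rotation by pi in the v-w plane
```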
 
  • #39
mathmari said:
It holds that $\sigma_v(v)=-v$ and $\sigma_v(w)=w, \sigma_v(b_3)=b_3, \sigma_v(b_4)=b_4$, or not?
Better. :cool:
But we do not generally have $\sigma_v(w)=w$. That is only the case if $w$ happens to be orthogonal to $v$.
If that is the case, then we did find $D$ for this special case, in which we have $\alpha=\pi$. 🤔
 
  • #40
Klaas van Aarsen said:
Better. :cool:
But we do not generally have $\sigma_v(w)=w$. That is only the case if $w$ happens to be orthogonal to $v$. 🤔

Yes in this case we suppose that $w, b_3, b_4$ are orthogonal to $v$, right? :unsure:

So is this the resulting matrix, and do we then find an $\alpha$ that gives it? :unsure:
 
  • #41
Klaas van Aarsen said:
If that is the case, then we did find $D$ for this special case, in which we have $\alpha=\pi$. 🤔

So do we have to do the same in the case that $v$ and $w$ are not orthogonal? :unsure:
 
  • #42
We can always pick $b_3$ and $b_4$, such that they are orthogonal to both $v$ and $w$.
However, we do not get to pick $w$. The vectors $v$ and $w$ are given as part of the problem. We do not know anything about their relationship.
We can only distinguish the cases that they are either independent or not. (Sweating)
And of course we can take a look at the special case that $v$ and $w$ are orthogonal and see that it works out.
 
  • #43
Klaas van Aarsen said:
We can always pick $b_3$ and $b_4$, such that they are orthogonal to both $v$ and $w$.
However, we do not get to pick $w$. The vectors $v$ and $w$ are given as part of the problem. We do not know anything about their relationship.
We can only distinguish the cases that they are either independent or not. (Sweating)

If $v$ and $w$ are not independent.
It holds that $\sigma_v(v)=-v$ and $\sigma_v(w)=w-(w\cdot v)w=(1-w\cdot v)w$, $\sigma_v(b_3)=b_3, \sigma_v(b_4)=b_4$, or not?
So we get the matrix $\begin{pmatrix}-1 & 0 & 0 & 0 \\ 0 & (1-w\cdot v) & 0 & 0 \\ 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 1\end{pmatrix}$.
Respectively, $\sigma_w(w)=-w$ and $\sigma_w(v)=v-(v\cdot w)v=(1-v\cdot w)v, \sigma_w(b_3)=b_3, \sigma_w(b_4)=b_4$, right?
So we get the matrix $\begin{pmatrix}(1-v\cdot w) & 0 & 0 & 0 \\ 0 & -1 & 0 & 0 \\ 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 1\end{pmatrix}$.

Is that correct? :unsure:
 
  • #44
mathmari said:
If $v$ and $w$ are not independent.
Erm... if $v$ and $w$ are not independent, then they are dependent, and $(v,w,b_3,b_4)$ is not a basis.
Then we cannot write the matrix of $\sigma_v$ with respect to $(v,w,b_3,b_4)$. (Tauri)
 
  • #45
And we have $\sigma_v(w)=w-2(w\cdot v)v$, don't we? :unsure:
 
  • #46
Klaas van Aarsen said:
Erm... if $v$ and $w$ are not independent, then they are dependent, and $(v,w,b_3,b_4)$ is not a basis.
Then we cannot write the matrix of $\sigma_v$ with respect to $(v,w,b_3,b_4)$. (Tauri)

So $v$ and $w$ must be independent? Or do we do something else in that case? :unsure:
 
  • #47
mathmari said:
So $v$ and $w$ must be independent? Or do we do something else in that case?
Not necessarily. It's just a different case. 🤔
If $v$ and $w$ are dependent, we can pick the basis $(v,b_2,b_3,b_4)$ with each $b_i$ perpendicular to both $v$ and $w$.
It's another special case that corresponds to $\alpha=0$.
 
  • #48
Klaas van Aarsen said:
Not necessarily. It's just a different case. 🤔
If $v$ and $w$ are dependent, we can pick the basis $(v,b_2,b_3,b_4)$ with each $b_i$ perpendicular to both $v$ and $w$.
It's another special case that corresponds to $\alpha=0$.

So in general we have these two cases, right: one with $\alpha=\pi$ and one with $\alpha=0$? :unsure:
 
  • #49
mathmari said:
So in general we have these two cases, right: one with $\alpha=\pi$ and one with $\alpha=0$?
The more 'general' case is when $0<\alpha<\pi$. Of course we also need to ensure that the edge cases are covered.
 
  • #50
Klaas van Aarsen said:
The more 'general' case is when $0<\alpha<\pi$. Of course we also need to ensure that the edge cases are covered.

But how do we get that more general case, so that the matrix depends on $\alpha$ ? :unsure:
 
  • #51
Let's see what $\sigma_v$ looks like with respect to the basis $(v,w,b_3,b_4)$ assuming that $v$ and $w$ are independent.

We have $\sigma_v(w)=w-2(w\cdot v)v$.
So:
$$D_{\sigma_v} = \begin{pmatrix}-1&-2w\cdot v\\&1\\&&1\\&&&1\end{pmatrix}$$

Similarly we have $\sigma_w(v)=v-2(v\cdot w)w$. So:
$$D_{\sigma_w} = \begin{pmatrix}1\\-2v\cdot w&-1\\&&1\\&&&1\end{pmatrix}$$
And:
$$D_{\sigma_v} D_{\sigma_w}= \begin{pmatrix}-1+4(v\cdot w)^2&2v\cdot w\\-2v\cdot w&-1\\&&1\\&&&1\end{pmatrix}$$

Close, isn't it? But not quite there. 🤔

We do see that we have a top left 2x2 matrix and otherwise the identity matrix.
And we already know that 2 reflections are supposed to make a rotation in 2 dimensions.
But that is with respect to an orthonormal basis.
So suppose we construct an orthonormal basis.
We can start with $v$.
We can pick a second vector $b_2$ perpendicular to $v$ with unit length such that $w$ is a linear combination of $v$ and $b_2$.
If $v$ and $w$ are independent, then we can construct it with $\tilde b_2 =w-(w\cdot v)v$ and then pick $b_2=\tilde b_2/\|\tilde b_2\|$.
What can we do if $v$ and $w$ are dependent? 🤔
Finally we pick $b_3$ and $b_4$ so that $(v,b_2,b_3,b_4)$ is an orthonormal basis.

The resulting matrix $D_{\sigma_v} D_{\sigma_w}$ will again have a top left 2x2 matrix and will otherwise be the identity matrix, won't it? 🤔
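To check this numerically (an editor's sketch, not part of the original thread; the helper `reflection` and the random unit vectors are illustrative assumptions), we can build the orthonormal basis $(v,b_2,b_3,b_4)$ exactly as described and confirm that the product has a $2\times 2$ rotation block by $2\theta$ in the top left and the identity elsewhere:

```python
import numpy as np

def reflection(v):
    # sigma_v(x) = x - 2 (x . v) v as a matrix, for a unit vector v
    v = v / np.linalg.norm(v)
    return np.eye(len(v)) - 2.0 * np.outer(v, v)

rng = np.random.default_rng(0)
v = rng.standard_normal(4); v /= np.linalg.norm(v)
w = rng.standard_normal(4); w /= np.linalg.norm(w)

# b_2 = (w - (w.v)v) / ||w - (w.v)v|| as in the post; b_3, b_4 complete
# the orthonormal basis (here via QR on [v, b_2, random, random]).
b2 = w - (w @ v) * v
b2 /= np.linalg.norm(b2)
Q, _ = np.linalg.qr(np.column_stack([v, b2, rng.standard_normal((4, 2))]))

# Matrix of sigma_v composed with sigma_w with respect to that basis.
D = Q.T @ reflection(v) @ reflection(w) @ Q

theta = np.arccos(v @ w)  # angle between v and w
R = np.array([[np.cos(2 * theta), -np.sin(2 * theta)],
              [np.sin(2 * theta),  np.cos(2 * theta)]])

# Top-left 2x2 block is a rotation by 2*theta (its orientation depends on
# the signs QR picks for the basis vectors); the rest is the identity.
print(np.allclose(D[:2, :2], R) or np.allclose(D[:2, :2], R.T))
print(np.allclose(D[2:, 2:], np.eye(2)) and np.allclose(D[:2, 2:], 0.0))
```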
 
  • #52
So in general, is $\alpha$ the angle between $v$ and $w$ ? :unsure:
 
  • #53
mathmari said:
So in general, is $\alpha$ the angle between $v$ and $w$ ?
$\alpha$ is twice the angle between $v$ and $w$. 🤔
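One way to see this numerically (an editor's sketch, not part of the original thread; the angle $\theta=0.7$ is an arbitrary illustrative choice): the eigenvalues of $\sigma_v\sigma_w$ are $e^{\pm 2i\theta},1,1$, so the rotation angle $\alpha$ read off from them is $2\theta$:

```python
import numpy as np

def reflection(v):
    # sigma_v(x) = x - 2 (x . v) v as a matrix, for a unit vector v
    v = v / np.linalg.norm(v)
    return np.eye(len(v)) - 2.0 * np.outer(v, v)

theta = 0.7  # arbitrary angle between v and w
v = np.array([1.0, 0.0, 0.0, 0.0])
w = np.array([np.cos(theta), np.sin(theta), 0.0, 0.0])

eigs = np.linalg.eigvals(reflection(v) @ reflection(w))
angles = np.sort(np.abs(np.angle(eigs)))
print(angles)  # [0, 0, 1.4, 1.4]: rotation angle alpha = 2 * theta
```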
 
