Diagonalizable transformation - Existence of basis

  • #1
mathmari
Hey! :giggle:

Let $1\leq n\in \mathbb{N}$ and for $x=\begin{pmatrix}x_1\\ x_2\\ \vdots \\ x_n\end{pmatrix}, \ y=\begin{pmatrix}y_1\\ y_2\\ \vdots \\ y_n\end{pmatrix}\in \mathbb{R}^n$ let $x\cdot y=\sum_{i=1}^nx_iy_i$ be the dot product of $x$ and $y$.

Let $S=\{v\in \mathbb{R}^n\mid v\cdot v=1\}$ and for $v\in S$ let $\sigma_v$ be a map defined by $\sigma_v:\mathbb{R}^n\rightarrow \mathbb{R}^n, \ x\mapsto x-2(x\cdot v)v$.

I have shown that it holds for $v\in S$ and $x,y\in \mathbb{R}^n$ that $\sigma_v(x)\cdot \sigma_v(y)=x\cdot y$.

Let $v\in S$. I have shown that $\sigma_v^2=\text{id}_{\mathbb{R}^n}$. To show that $\sigma_v$ is diagonalizable do we have to calculate the matrix of that transformation?
Let $n=4$ and $v,w\in S$. I want to show that there is $0\leq \alpha\leq 2\pi$ and an orthogonal basis $B$ of $\mathbb{R}^4$ such that the matrix of $\sigma_v\circ\sigma_w$ with respect to the basis $B$ is of the form $$D=\begin{pmatrix}\cos \alpha & -\sin \alpha & 0 & 0 \\ \sin \alpha & \cos \alpha & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1\end{pmatrix}$$ Could you give me a hint for that? Do we have to find the matrix of $\sigma_v\circ\sigma_w$, calculate the images of the elements of the basis $B$, write the results as linear combinations of the elements of $B$, and check that the resulting matrix is $D$? :unsure:
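(Side note, not part of the thread: the two properties already shown, $\sigma_v(x)\cdot\sigma_v(y)=x\cdot y$ and $\sigma_v^2=\text{id}$, are easy to sanity-check numerically. A minimal Python/NumPy sketch with random test data; the helper name `sigma` is mine.)

```python
import numpy as np

def sigma(v, x):
    """Reflection x -> x - 2 (x·v) v, for a unit vector v."""
    return x - 2 * np.dot(x, v) * v

rng = np.random.default_rng(0)
n = 4
v = rng.standard_normal(n)
v /= np.linalg.norm(v)          # normalize so that v·v = 1, i.e. v ∈ S

x = rng.standard_normal(n)
y = rng.standard_normal(n)

# sigma_v preserves the dot product: sigma_v(x)·sigma_v(y) = x·y
print(np.isclose(np.dot(sigma(v, x), sigma(v, y)), np.dot(x, y)))   # True

# sigma_v is an involution: sigma_v(sigma_v(x)) = x
print(np.allclose(sigma(v, sigma(v, x)), x))                        # True
```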
 
  • #2
Hey mathmari!

A matrix is diagonalizable iff the eigenvectors form a basis.
This is actually the definition of diagonalizable as it applies to transformations in general.
So it suffices if we can show that $\sigma_v$ has $n$ independent eigenvectors. 🧐

Is $v$ an eigenvector? 🤔
What about a vector perpendicular to $v$? 🤔
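(Another side note: with respect to the standard basis, $\sigma_v$ has the matrix $I-2vv^T$, a Householder reflection, so the eigenvector hints above can also be checked numerically. A sketch, again with random test data:)

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
v = rng.standard_normal(n)
v /= np.linalg.norm(v)

# Matrix of sigma_v in the standard basis: x - 2(x·v)v = (I - 2 v v^T) x
H = np.eye(n) - 2 * np.outer(v, v)

# H is symmetric, hence diagonalizable; eigenvalues are -1 (once) and +1 (n-1 times)
print(np.round(np.sort(np.linalg.eigvalsh(H)), 6))     # [-1.  1.  1.  1.]

# v is an eigenvector for -1, and any u perpendicular to v is an eigenvector for +1
u = rng.standard_normal(n)
u -= np.dot(u, v) * v                                   # remove the v-component
print(np.allclose(H @ v, -v), np.allclose(H @ u, u))    # True True
```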
 
  • #3
Klaas van Aarsen said:
So it suffices if we can show that $\sigma_v$ has $n$ independent eigenvectors. 🧐

Is $v$ an eigenvector? 🤔
What about a vector perpendicular to $v$? 🤔
$v$ is an eigenvector if $\sigma_v(v)=\lambda v$ for some $\lambda\in \mathbb{R}$, right?
We have that $ \sigma_v(v)=v-2(v\cdot v)v =v-2v=-v=(-1)v $. That means that $v$ is an eigenvector for the eigenvalue $\lambda=-1$.
Is that correct? :unsure:

Let $w$ be a vector perpendicular to $v$, then $w\cdot v=0$.
We have that $ \sigma_v(w)=w-2(w\cdot v)v =w$. That means that $w$ is an eigenvector for the eigenvalue $\lambda=1$.
Is that correct? :unsure:

Now we have found two independent eigenvectors, but we need $n$. :unsure:
 
  • #4
All correct. (Nod)

How many independent vectors can we find that are perpendicular to $v$? 🤔
 
  • #5
Klaas van Aarsen said:
All correct. (Nod)

How many independent vectors can we find that are perpendicular to $v$? 🤔

Are there $n-1$ independent vectors perpendicular to $v$, since we are in $\mathbb{R}^n$? :unsure:
 
  • #6
mathmari said:
Are there $n-1$ independent vectors perpendicular to $v$, since we are in $\mathbb{R}^n$? :unsure:
Yes. Any orthogonal basis of $\mathbb R^n$ that includes $v$ contains $n-1$ vectors that are orthogonal to $v$. 🤔
 
  • #7
Klaas van Aarsen said:
Yes. Any orthogonal basis of $\mathbb R^n$ that includes $v$ contains $n-1$ vectors that are orthogonal to $v$. 🤔

Ahh ok!

If we want to determine the eigenvalues of $\sigma_v$ then do we do the following?
$$\sigma_v(x)=\lambda x \Rightarrow \sigma_v\left (\sigma_v(x)\right )=\sigma_v\left (\lambda x \right ) \Rightarrow \sigma_v^2(x)=\lambda \sigma_v(x) \Rightarrow x=\lambda \sigma_v(x)\Rightarrow x=\lambda \cdot \lambda x\Rightarrow x=\lambda^2 x\Rightarrow (\lambda^2-1) x=0\Rightarrow \lambda=\pm 1,$$ since an eigenvector $x$ is nonzero. So the eigenvalues are $-1$ and $1$. Is that correct? :unsure:

To find the dimension of the respective eigenspace, do we calculate the geometric multiplicity, which has to be equal to the algebraic multiplicity since $\sigma_v$ is diagonalizable? Or how do we calculate the dimension in this case? :unsure:
 
Last edited by a moderator:
  • #8
Yes. That works in both cases. (Nod)

Note that we've already found $1$ eigenvector $v$ for the eigenvalue $\lambda=-1$, and $n-1$ independent eigenvectors for the eigenvalue $\lambda=1$.
Since the dimension of the space is $n$, that implies that $\lambda=-1$ has both algebraic and geometric multiplicity of $1$.
And $\lambda=1$ has both algebraic and geometric multiplicity of $n-1$.
That is, we don't need to use the argument of diagonalizability to conclude that. 🧐
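(Spelling the counting out, since the step is easy to miss: writing $m_g$ and $m_a$ for geometric and algebraic multiplicity, we found $m_g(-1)\geq 1$ and $m_g(1)\geq n-1$, and always $m_g\leq m_a$, while the algebraic multiplicities of all eigenvalues add up to at most $n$. So $$n\leq m_g(-1)+m_g(1)\leq m_a(-1)+m_a(1)\leq n,$$ which forces $m_a(-1)=m_g(-1)=1$ and $m_a(1)=m_g(1)=n-1$.)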
 
  • #9
Klaas van Aarsen said:
Since the dimension of the space is $n$, that implies that $\lambda=-1$ has both algebraic and geometric multiplicity of $1$.
And $\lambda=1$ has both algebraic and geometric multiplicity of $n-1$.

How do we know that the algebraic multiplicity of $\lambda=1$ is $n-1$ ? :unsure:
 
  • #10
mathmari said:
How do we know that the algebraic multiplicity of $\lambda=1$ is $n-1$ ?
Because an eigenvalue's geometric multiplicity cannot exceed its algebraic multiplicity. 🤔
 
  • #11
Klaas van Aarsen said:
Because an eigenvalue's geometric multiplicity cannot exceed its algebraic multiplicity. 🤔

Ah, because there are $n-1$ independent vectors like $w$, i.e. $n-1$ independent eigenvectors for $\lambda=1$, the geometric multiplicity is $n-1$? :unsure:
 
  • #12
mathmari said:
Ah, because there are $n-1$ independent vectors like $w$, i.e. $n-1$ independent eigenvectors for $\lambda=1$, the geometric multiplicity is $n-1$?
Yep. (Nod)
 
  • #13
Ok! :geek:
mathmari said:
Let $n=4$ and $v,w\in S$. I want to show that there is $0\leq \alpha\leq 2\pi$ and an orthogonal basis $B$ of $\mathbb{R}^4$ such that the matrix of $\sigma_v\circ\sigma_w$ with respect to the basis $B$ is of the form $$D=\begin{pmatrix}\cos \alpha & -\sin \alpha & 0 & 0 \\ \sin \alpha & \cos \alpha & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1\end{pmatrix}$$ Could you give me a hint for that? Do we have to find the matrix of $\sigma_v\circ\sigma_w$, calculate the images of the elements of the basis $B$, write the results as linear combinations of the elements of $B$, and check that the resulting matrix is $D$? :unsure:

Could you give me a hint also for that? :unsure:
 
  • #14
mathmari said:
Could you give me a hint also for that?
Can we find a basis of $\mathbb R^4$ that has 2 vectors in it that are orthogonal to both $v$ and $w$? 🤔
 
Last edited:
  • #15
Klaas van Aarsen said:
Can we find a basis of $\mathbb R^n$ that has 2 more vectors in it that are orthogonal to both $v$ and $w$? 🤔

Their dot product is a vector that is orthogonal to both, right? How can we find some more? I got stuck right now. :unsure:
 
  • #16
The dot product is a scalar and not a vector. Furthermore, the cross product is not defined in 4 dimensions. :oops:

Can't we just state that such a basis must exist?
We don't have to actually find such vectors. 🤔

Either way, we can find them by starting with $v$ and $w$, and by adding each of the unit vectors until we have 4 independent vectors. After that we can use the Gram-Schmidt orthogonalization process to find 2 vectors that are orthogonal to both $v$ and $w$. 🤔
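(A sketch of that procedure in Python/NumPy, just as an illustration; the helper name `extend_to_orthonormal_basis` and the random $v,w$ are mine, not from the thread.)

```python
import numpy as np

def extend_to_orthonormal_basis(v, w):
    """Return an orthonormal basis (b1, b2, b3, b4) of R^4 whose first two
    vectors span span{v, w}; b3 and b4 are then orthogonal to both v and w."""
    candidates = [v, w] + list(np.eye(4))   # add unit vectors until 4 independent ones remain
    basis = []
    for c in candidates:
        # Gram-Schmidt step: remove the components along the vectors found so far
        for b in basis:
            c = c - np.dot(c, b) * b
        if np.linalg.norm(c) > 1e-10:       # keep only genuinely new directions
            basis.append(c / np.linalg.norm(c))
        if len(basis) == 4:
            break
    return basis

rng = np.random.default_rng(2)
v = rng.standard_normal(4); v /= np.linalg.norm(v)
w = rng.standard_normal(4); w /= np.linalg.norm(w)

b1, b2, b3, b4 = extend_to_orthonormal_basis(v, w)
# b3 and b4 are orthogonal to both v and w
print(np.allclose([np.dot(b3, v), np.dot(b3, w), np.dot(b4, v), np.dot(b4, w)], 0))  # True
```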
 
  • #17
I don't mean to butt in but I would just like to say how much I am enjoying these threads. I know the material in general but I'm getting some extra details that I have missed. Both of you keep up the good work!

-Dan
 
  • #18
Klaas van Aarsen said:
Either way, we can find them by starting with $v$ and $w$, and by adding each of the unit vectors until we have 4 independent vectors. After that we can use the Gram-Schmidt orthogonalization process to find 2 vectors that are orthogonal to both $v$ and $w$. 🤔

So that means that $B$ will then be the set of $v$, $w$ and the two vectors that we get from the Gram-Schmidt orthogonalization process? :unsure:

But just by stating that such a basis exists, how can we find the form of the matrix $D$? I got stuck right now. :unsure:
 
  • #19
What do $\sigma_v$ and $\sigma_w$ look like with respect to a basis that contains $v$, $w$, and vectors orthogonal to both $v$ and $w$? 🤔
 
  • #20
Klaas van Aarsen said:
What do $\sigma_v$ and $\sigma_w$ look like with respect to a basis that contains $v$, $w$, and vectors orthogonal to both $v$ and $w$? 🤔

They are invariant, aren't they? :unsure:
 
  • #21
mathmari said:
They are invariant, aren't they?
The extra orthogonal vectors are indeed invariant with respect to both $\sigma_v$ and $\sigma_w$. (Nod)

So? (Wondering)
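(To spell that out: if $b$ is one of the extra basis vectors, then $b\cdot v=b\cdot w=0$, so $$\sigma_v(b)=b-2(b\cdot v)v=b,\qquad \sigma_w(b)=b-2(b\cdot w)w=b,$$ and hence also $(\sigma_v\circ\sigma_w)(b)=b$.)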
 
  • #22
Klaas van Aarsen said:
The extra orthogonal vectors are indeed invariant with respect to both $\sigma_v$ and $\sigma_w$. (Nod)

So? (Wondering)

That's why we get the last two columns of the matrix $D$, right? :unsure:
 
  • #23
mathmari said:
That's why we get the last two columns of the matrix $D$, right?
Yep.
Both $\sigma_v$ and $\sigma_w$ have a matrix with respect to that basis that has the same last two columns as $D$. 🤔
 
  • #24
Klaas van Aarsen said:
Yep.
Both $\sigma_v$ and $\sigma_w$ have a matrix with respect to that basis that has the same last two columns as $D$. 🤔

So the first two columns of $D$ correspond to the vectors $v$ and $w$ ? :unsure:
 
  • #25
mathmari said:
So the first two columns of $D$ correspond to the vectors $v$ and $w$ ?
Let's assume for now that $v$ is independent from $w$.
And let $b_3$ and $b_4$ be vectors that are orthogonal to both $v$ and $w$.
Then we have that $\sigma_v(v)=-v$, so the first column of the matrix of ${\sigma_v}$ with respect to the basis $(v,w,b_3,b_4)$ is $\begin{pmatrix}-1\\0\\0\\0\end{pmatrix}$.
We also have that $\sigma_w(w)=-w$, so the second column of the matrix of ${\sigma_w}$ with respect to the basis $(v,w,b_3,b_4)$ is $\begin{pmatrix}0\\-1\\0\\0\end{pmatrix}$. 🤔
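(Going one small step beyond the post, and writing $c=v\cdot w$: the remaining columns come from $\sigma_v(w)=w-2(w\cdot v)v$ and $\sigma_w(v)=v-2(v\cdot w)w$, so with respect to the basis $(v,w,b_3,b_4)$ the matrices work out to $$[\sigma_v]=\begin{pmatrix}-1 & -2c & 0 & 0\\ 0 & 1 & 0 & 0\\ 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 1\end{pmatrix},\qquad [\sigma_w]=\begin{pmatrix}1 & 0 & 0 & 0\\ -2c & -1 & 0 & 0\\ 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 1\end{pmatrix}.$$ These are not yet of the form $D$; that is what the following posts address.)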
 
Last edited:
  • #26
Klaas van Aarsen said:
Let's assume for now that $v$ is independent from $w$.
And let $b_3$ and $b_4$ be vectors that are orthogonal to both $v$ and $w$.
Then we have that $\sigma_v(v)=-v$, so the first column of the matrix of ${\sigma_v}$ with respect to the basis $(v,w,b_3,b_4)$ is $\begin{pmatrix}-1\\0\\0\\0\end{pmatrix}$.
We also have that $\sigma_w(w)=-w$, so the second column of the matrix of ${\sigma_w}$ with respect to the basis $(v,w,b_3,b_4)$ is $\begin{pmatrix}0\\-1\\0\\0\end{pmatrix}$. 🤔

Ok, but shouldn't these vectors then be the first and second columns of $D$? How do we then get the cosine and the sine? :unsure:
 
  • #27
We can reduce the 4-dimensional problem to a 2-dimensional problem.
And we already know that the composition of two reflections in 2 dimensions is a rotation, don't we? :unsure:
 
  • #28
Klaas van Aarsen said:
We can reduce the 4-dimensional problem to a 2-dimensional problem.
And we already know that the composition of two reflections in 2 dimensions is a rotation, don't we? :unsure:

Ah so we consider the rotation matrix, right? :unsure:

So $\sigma_v\circ\sigma_w$ is the composition of two reflections, which is a rotation. Therefore the matrix $D$ must contain the rotation matrix for these vectors? :unsure:
 
  • #29
mathmari said:
Ah so we consider the rotation matrix, right?

So $\sigma_v\circ\sigma_w$ is the composition of two reflections, which is a rotation. Therefore the matrix $D$ must contain the rotation matrix for these vectors?
Yes. (Nod)

Generally, the composition of 2 reflections is a rotation of double the angle between the normals of the planes of reflection. 🧐
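(For reference, the $2$-dimensional computation behind that statement: a reflection with unit normal $n_\theta=(\cos\theta,\sin\theta)$ has matrix $I-2n_\theta n_\theta^T=\begin{pmatrix}-\cos 2\theta & -\sin 2\theta\\ -\sin 2\theta & \cos 2\theta\end{pmatrix}$, and multiplying two of these gives $$\begin{pmatrix}-\cos 2\theta_1 & -\sin 2\theta_1\\ -\sin 2\theta_1 & \cos 2\theta_1\end{pmatrix}\begin{pmatrix}-\cos 2\theta_2 & -\sin 2\theta_2\\ -\sin 2\theta_2 & \cos 2\theta_2\end{pmatrix}=\begin{pmatrix}\cos\alpha & -\sin\alpha\\ \sin\alpha & \cos\alpha\end{pmatrix},\qquad \alpha=2(\theta_1-\theta_2),$$ i.e. a rotation by twice the angle between the normals.)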
 
  • #30
Klaas van Aarsen said:
Yes. (Nod)

Generally, the composition of 2 reflections is a rotation of double the angle between the normals of the planes of reflection. 🧐

So do we have to consider that $v$ and $w$ are independent, or do we also have to check the case that they are not independent? :unsure:
 
  • #31
mathmari said:
So do we have to consider that $v$ and $w$ are independent, or do we also have to check the case that they are not independent?
That depends on how we set up the proof.
Perhaps we can start with the assumption that v and w are independent.
When the proof is complete, perhaps we won't have to make the distinction any more. 🤔
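(For completeness: the dependent case is quick to settle. If $v$ and $w$ are not independent, then $w=\pm v$ because both are unit vectors, and in either case $\sigma_w=\sigma_v$, so $\sigma_v\circ\sigma_w=\sigma_v^2=\text{id}_{\mathbb{R}^4}$, which has the form $D$ with $\alpha=0$ for any orthogonal basis $B$.)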
 
  • #32
Klaas van Aarsen said:
Let's assume for now that $v$ is independent from $w$.
And let $b_3$ and $b_4$ be vectors that are orthogonal to both $v$ and $w$.
Then we have that $\sigma_v(v)=-v$, so the first column of the matrix of ${\sigma_v}$ with respect to the basis $(v,w,b_3,b_4)$ is $\begin{pmatrix}-1\\0\\0\\0\end{pmatrix}$.
We also have that $\sigma_w(w)=-w$, so the second column of the matrix of ${\sigma_w}$ with respect to the basis $(v,w,b_3,b_4)$ is $\begin{pmatrix}0\\-1\\0\\0\end{pmatrix}$. 🤔

But here we don't have the rotation matrix, do we? :unsure:
 
  • #33
mathmari said:
But here we don't have the rotation matrix, do we?
Not yet. 🤔
 
  • #34
Klaas van Aarsen said:
Not yet. 🤔

I got stuck right now. What do we do next? How do we get the rotation matrix? :unsure:
 
  • #35
mathmari said:
I got stuck right now. What do we do next? How do we get the rotation matrix?
What is the matrix of $\sigma_v$ with respect to the basis $(v,w,b_3,b_4)$?
What is the matrix of $\sigma_w$ with respect to the basis $(v,w,b_3,b_4)$?
What is the product of those matrices?
Can we find a more convenient basis so that we get $D$? 🤔
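(A numerical check of where these hints lead, just as an illustration; the random $v,w$ and the basis construction, which reuses the Gram-Schmidt idea from above, are my own test setup.)

```python
import numpy as np

rng = np.random.default_rng(3)
v = rng.standard_normal(4); v /= np.linalg.norm(v)
w = rng.standard_normal(4); w /= np.linalg.norm(w)

# Matrices of sigma_v, sigma_w and of the composition in the standard basis
Hv = np.eye(4) - 2 * np.outer(v, v)
Hw = np.eye(4) - 2 * np.outer(w, w)
M = Hv @ Hw                                  # matrix of sigma_v ∘ sigma_w

# Orthonormal basis whose first two vectors span span{v, w} (Gram-Schmidt)
b1 = v
b2 = w - np.dot(w, v) * v; b2 /= np.linalg.norm(b2)
basis = [b1, b2]
for e in np.eye(4):
    c = e - sum(np.dot(e, b) * b for b in basis)
    if np.linalg.norm(c) > 1e-10:
        basis.append(c / np.linalg.norm(c))
B = np.column_stack(basis[:4])               # orthogonal change-of-basis matrix

D = B.T @ M @ B                              # matrix of sigma_v ∘ sigma_w w.r.t. the new basis
print(np.round(D, 6))
# Expected: a 2x2 rotation block [[cos a, -sin a], [sin a, cos a]] in the top-left
# corner and the 2x2 identity in the bottom-right corner, matching the form of D.
```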
 
