MHB Diagonalizable transformation - Existence of basis

SUMMARY

The forum discussion centers on the diagonalizability of the map $\sigma_v:\mathbb{R}^n\rightarrow \mathbb{R}^n$, $\sigma_v(x)=x-2(x\cdot v)v$, for $v$ in the unit sphere $S$. The participants establish that $\sigma_v$ has eigenvalues $-1$ and $1$, with $v$ as an eigenvector for $-1$ and $n-1$ independent eigenvectors perpendicular to $v$ for $1$, so that $\sigma_v$ is diagonalizable. For $n=4$ the discussion concludes that the composition $\sigma_v\circ\sigma_w$ (with $v,w\in S$) can be represented with respect to a suitable orthogonal basis by a rotation matrix of the form $$D=\begin{pmatrix}\cos \alpha & -\sin \alpha & 0 & 0 \\ \sin \alpha & \cos \alpha & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1\end{pmatrix}$$ for some angle $\alpha$.

PREREQUISITES
  • Understanding of linear transformations in $\mathbb{R}^n$
  • Familiarity with eigenvalues and eigenvectors
  • Knowledge of orthogonal bases and Gram-Schmidt process
  • Concept of diagonalizability in linear algebra
NEXT STEPS
  • Study the properties of reflections and their relationship to rotations in linear algebra
  • Learn about the Gram-Schmidt orthogonalization process for constructing orthogonal bases
  • Explore the concept of eigenvalue multiplicity and its implications for diagonalizability
  • Investigate the geometric interpretation of linear transformations in $\mathbb{R}^n$
USEFUL FOR

Mathematicians, students of linear algebra, and anyone interested in understanding the properties of linear transformations and their diagonalizability in higher dimensions.

mathmari
Hey! :giggle:

Let $1\leq n\in \mathbb{N}$, let $x=\begin{pmatrix}x_1\\ x_2\\ \vdots \\ x_n\end{pmatrix}, \ y=\begin{pmatrix}y_1\\ y_2\\ \vdots \\ y_n\end{pmatrix}\in \mathbb{R}^n$, and let $x\cdot y=\sum_{i=1}^nx_iy_i$ be the dot product of $x$ and $y$.

Let $S=\{v\in \mathbb{R}^n\mid v\cdot v=1\}$ and for $v\in S$ let $\sigma_v$ be a map defined by $\sigma_v:\mathbb{R}^n\rightarrow \mathbb{R}^n, \ x\mapsto x-2(x\cdot v)v$.

I have shown that it holds for $v\in S$ and $x,y\in \mathbb{R}^n$ that $\sigma_v(x)\cdot \sigma_v(y)=x\cdot y$.

Let $v\in S$. I have shown that $\sigma_v^2=\text{id}_{\mathbb{R}^n}$. To show that $\sigma_v$ is diagonalizable do we have to calculate the matrix of that transformation?
Let $n=4$ and $v,w\in S$. I want to show that there is an angle $0\leq \alpha\leq 2\pi$ and an orthogonal basis $B$ of $\mathbb{R}^4$ such that the matrix of $\sigma_v\circ\sigma_w$ with respect to the basis $B$ is of the form $$D=\begin{pmatrix}\cos \alpha & -\sin \alpha & 0 & 0 \\ \sin \alpha & \cos \alpha & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1\end{pmatrix}$$ Could you give me a hint for that? Do we have to find the matrix of $\sigma_v\circ\sigma_w$, calculate the images of the elements of the basis $B$, and then write the result as a linear combination of the elements of $B$, so that the result is the matrix $D$? :unsure:
 
Hey mathmari!

A matrix is diagonalizable iff there is a basis consisting of its eigenvectors.
This is actually the definition of diagonalizable as it applies to transformations in general.
So it suffices if we can show that $\sigma_v$ has $n$ independent eigenvectors. 🧐

Is $v$ an eigenvector? 🤔
What about a vector perpendicular to $v$? 🤔
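
A quick numerical sanity check of this (a sketch, not part of the original exchange), assuming $n=4$ and representing $\sigma_v$ in the standard basis by $A_v=I-2vv^\top$, which follows from the definition since $(x\cdot v)v=vv^\top x$:

```python
import numpy as np

n = 4
rng = np.random.default_rng(0)

# A random unit vector v on the sphere S.
v = rng.standard_normal(n)
v /= np.linalg.norm(v)

# Matrix of sigma_v in the standard basis: sigma_v(x) = x - 2(x.v)v = (I - 2 v v^T) x.
A_v = np.eye(n) - 2 * np.outer(v, v)

# A_v is symmetric, so np.linalg.eigh applies; eigenvalues come back in ascending order.
eigvals, eigvecs = np.linalg.eigh(A_v)
print(np.round(eigvals, 6))               # [-1.  1.  1.  1.]
print(np.linalg.matrix_rank(eigvecs))     # 4: the eigenvectors form a basis of R^4
```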
 
Klaas van Aarsen said:
So it suffices if we can show that $\sigma_v$ has $n$ independent eigenvectors. 🧐

Is $v$ an eigenvector? 🤔
What about a vector perpendicular to $v$? 🤔
$v$ is an eigenvector if $\sigma_v(v)=\lambda v$ for some $\lambda\in \mathbb{R}$, right?
We have that $ \sigma_v(v)=v-2(v\cdot v)v =v-2v=-v=(-1)v $. That means that $v$ is an eigenvector for the eigenvalue $\lambda=-1$.
Is that correct? :unsure:

Let $w$ be a vector perpendicular to $v$, then $w\cdot v=0$.
We have that $ \sigma_v(w)=w-2(w\cdot v)v =w$. That means that $w$ is an eigenvector for the eigenvalue $\lambda=1$.
Is that correct? :unsure:

Now we have found two independent eigenvectors, but we need $n$. :unsure:
 
All correct. (Nod)

How many independent vectors can we find that are perpendicular to $v$? 🤔
 
Klaas van Aarsen said:
All correct. (Nod)

How many independent vectors can we find that are perpendicular to $v$? 🤔

Are there $n-1$ independent vectors perpendicular to $v$, since we are in $\mathbb{R}^n$ ? :unsure:
 
mathmari said:
Are there $n-1$ independent vectors perpendicular to $v$, since we are in $\mathbb{R}^n$ ? :unsure:
Yes. Any orthogonal basis of $\mathbb R^n$ that includes $v$ contains $n-1$ vectors that are orthogonal to $v$. 🤔
 
Klaas van Aarsen said:
Yes. Any orthogonal basis of $\mathbb R^n$ that includes $v$ contains $n-1$ vectors that are orthogonal to $v$. 🤔

Ahh ok!

If we want to determine the eigenvalues of $\sigma_v$ then do we do the following?
$$\sigma_v(x)=\lambda x \Rightarrow \sigma_v\left (\sigma_v(x)\right )=\sigma_v\left (\lambda x \right ) \Rightarrow \sigma_v^2(x)=\lambda \sigma_v(x) \Rightarrow x=\lambda \sigma_v(x)\Rightarrow x=\lambda \cdot \lambda x\Rightarrow x=\lambda^2 x\Rightarrow (\lambda^2-1) x=0\Rightarrow \lambda=\pm 1$$ since an eigenvector satisfies $x\neq 0$. So the eigenvalues are $-1$ and $1$. Is that correct? :unsure:

To find the dimension of the respective eigenspace do we calculate the geometric multiplicity, which has to be equal to the algebraic multiplicity since $\sigma_v$ is diagonalizable? Or how do we calculate the dimension in this case? :unsure:
 
Yes. That works in both cases. (Nod)

Note that we've already found $1$ eigenvector $v$ for the eigenvalue $\lambda=-1$, and $n-1$ independent eigenvectors for the eigenvalue $\lambda=1$.
Since the dimension of the space is $n$, that implies that $\lambda=-1$ has both algebraic and geometric multiplicity of $1$.
And $\lambda=1$ has both algebraic and geometric multiplicity of $n-1$.
That is, we don't need to use the argument of diagonalizability to conclude that. 🧐
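
As a quick cross-check of the multiplicities (again a sketch outside the thread, using the assumed standard-basis matrix $I-2vv^\top$ for $n=4$), the characteristic polynomial comes out as $(\lambda+1)(\lambda-1)^{n-1}$:

```python
import numpy as np

n = 4
rng = np.random.default_rng(0)
v = rng.standard_normal(n); v /= np.linalg.norm(v)
A_v = np.eye(n) - 2 * np.outer(v, v)          # sigma_v in the standard basis

char_poly = np.poly(A_v)                      # coefficients of det(lambda*I - A_v)
expected = np.poly([-1, 1, 1, 1])             # roots -1 (once) and 1 (three times)
print(np.allclose(char_poly, expected))       # True: algebraic multiplicities 1 and n-1
```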
 
Klaas van Aarsen said:
Since the dimension of the space is $n$, that implies that $\lambda=-1$ has both algebraic and geometric multiplicity of $1$.
And $\lambda=1$ has both algebraic and geometric multiplicity of $n-1$.

How do we know that the algebraic multiplicity of $\lambda=1$ is $n-1$ ? :unsure:
 
  • #10
mathmari said:
How do we know that the algebraic multiplicity of $\lambda=1$ is $n-1$ ?
Because an eigenvalue's geometric multiplicity cannot exceed its algebraic multiplicity, and the algebraic multiplicities add up to at most $n$. 🤔
 
  • #11
Klaas van Aarsen said:
Because an eigenvalue's geometric multiplicity cannot exceed its algebraic multiplicity, and the algebraic multiplicities add up to at most $n$. 🤔

Ah, because there are $n-1$ vectors like $w$, i.e. there are $n-1$ independent eigenvectors for $\lambda=1$, that means that the geometric multiplicity is $n-1$ ? :unsure:
 
  • #12
mathmari said:
Ah, because there are $n-1$ vectors like $w$, i.e. there are $n-1$ independent eigenvectors for $\lambda=1$, that means that the geometric multiplicity is $n-1$ ?
Yep. (Nod)
 
  • #13
Ok! :geek:
mathmari said:
Let $n=4$ and $v,w\in S$. I want to show that there is an angle $0\leq \alpha\leq 2\pi$ and an orthogonal basis $B$ of $\mathbb{R}^4$ such that the matrix of $\sigma_v\circ\sigma_w$ with respect to the basis $B$ is of the form $$D=\begin{pmatrix}\cos \alpha & -\sin \alpha & 0 & 0 \\ \sin \alpha & \cos \alpha & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1\end{pmatrix}$$ Could you give me a hint for that? Do we have to find the matrix of $\sigma_v\circ\sigma_w$, calculate the images of the elements of the basis $B$, and then write the result as a linear combination of the elements of $B$, so that the result is the matrix $D$? :unsure:

Could you give me a hint also for that? :unsure:
 
  • #14
mathmari said:
Could you give me a hint also for that?
Can we find a basis of $\mathbb R^4$ that has 2 vectors in it that are orthogonal to both $v$ and $w$? 🤔
 
  • #15
Klaas van Aarsen said:
Can we find a basis of $\mathbb R^n$ that has 2 more vectors in it that are orthogonal to both $v$ and $w$? 🤔

Their dot product is one vector that is orthogonal to both, right? How can we find some more? I got stuck right now. :unsure:
 
  • #16
The dot product is a scalar and not a vector. Furthermore, the cross product is not defined in 4 dimensions. :oops:

Can't we just state that such a basis must exist?
We don't have to actually find such vectors. 🤔

Either way, we can find them by starting with $v$ and $w$, and by adding each of the standard unit vectors until we have 4 independent vectors. After that we can use the Gram-Schmidt orthogonalization process to find 2 vectors that are orthogonal to both $v$ and $w$. 🤔
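
A minimal sketch of that procedure (not from the thread itself; the helper name `extend_to_orthonormal_basis` is just illustrative), assuming $n=4$ and that $v$ and $w$ are independent:

```python
import numpy as np

def extend_to_orthonormal_basis(vectors, n=4, tol=1e-10):
    """Extend the given independent vectors to an orthonormal basis of R^n via
    Gram-Schmidt, using the standard unit vectors as extra candidates."""
    basis = []
    for c in list(vectors) + list(np.eye(n)):
        u = np.array(c, dtype=float)
        for b in basis:                      # remove components along earlier basis vectors
            u -= (u @ b) * b
        if np.linalg.norm(u) > tol:          # keep only if still independent
            basis.append(u / np.linalg.norm(u))
        if len(basis) == n:
            break
    return basis

rng = np.random.default_rng(1)
v = rng.standard_normal(4); v /= np.linalg.norm(v)
w = rng.standard_normal(4); w /= np.linalg.norm(w)

b1, b2, b3, b4 = extend_to_orthonormal_basis([v, w])
# b1 = v, b2 spans the rest of span{v, w}, and b3, b4 are orthogonal to both v and w.
print(np.round([b3 @ v, b3 @ w, b4 @ v, b4 @ w], 10))    # all zeros
```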
 
  • #17
I don't mean to butt in but I would just like to say how much I am enjoying these threads. I know the material in general but I'm getting some extra details that I have missed. Both of you keep up the good work!

-Dan
 
  • #18
Klaas van Aarsen said:
Either way, we can find them by starting with $v$ and $w$, and by adding each of the standard unit vectors until we have 4 independent vectors. After that we can use the Gram-Schmidt orthogonalization process to find 2 vectors that are orthogonal to both $v$ and $w$. 🤔

So that means that $B$ will then be the set of $v$, $w$ and the two vectors that we get from the Gram-Schmidt orthogonalization process? :unsure:

But if we just state that such a basis exists, how can we find the form of the matrix $D$ ? I got stuck right now. :unsure:
 
  • #19
What do $\sigma_v$ and $\sigma_w$ look like with respect to a basis that contains v, w, and vectors orthogonal to both v and w? 🤔
 
  • #20
Klaas van Aarsen said:
What do $\sigma_v$ and $\sigma_w$ look like with respect to a basis that contains v, w, and vectors orthogonal to both v and w? 🤔

They are invariant, aren't they? :unsure:
 
  • #21
mathmari said:
They are invariant, aren't they?
The extra orthogonal vectors are indeed invariant with respect to both $\sigma_v$ and $\sigma_w$. (Nod)

So? (Wondering)
 
  • #22
Klaas van Aarsen said:
The extra orthogonal vectors are indeed invariant with respect to both $\sigma_v$ and $\sigma_w$. (Nod)

So? (Wondering)

That's why we get the last two columns of the matrix $D$, right? :unsure:
 
  • #23
mathmari said:
That's why we get the last two columns of the matrix $D$, right?
Yep.
Both $\sigma_v$ and $\sigma_w$ have a matrix with respect to that basis that has the same last two columns as $D$. 🤔
 
  • #24
Klaas van Aarsen said:
Yep.
Both $\sigma_v$ and $\sigma_w$ have a matrix with respect to that basis that has the same last two columns as $D$. 🤔

So the first two columns of $D$ correspond to the vectors $v$ and $w$ ? :unsure:
 
  • #25
mathmari said:
So the first two columns of $D$ correspond to the vectors $v$ and $w$ ?
Let's assume for now that $v$ is independent from $w$.
And let $b_3$ and $b_4$ be vectors that are orthogonal to both $v$ and $w$.
Then we have that $\sigma_v(v)=-v$, so the first column of the matrix of ${\sigma_v}$ with respect to the basis $(v,w,b_3,b_4)$ is $\begin{pmatrix}-1\\0\\0\\0\end{pmatrix}$.
We also have that $\sigma_w(w)=-w$, so the second column of the matrix of ${\sigma_w}$ with respect to the basis $(v,w,b_3,b_4)$ is $\begin{pmatrix}0\\-1\\0\\0\end{pmatrix}$. 🤔
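
These columns can also be checked numerically (a sketch outside the thread, reusing the assumed standard-basis matrices $I-2vv^\top$ and $I-2ww^\top$ and building $b_3,b_4$ as orthonormal vectors perpendicular to both $v$ and $w$):

```python
import numpy as np

rng = np.random.default_rng(3)
v = rng.standard_normal(4); v /= np.linalg.norm(v)
w = rng.standard_normal(4); w /= np.linalg.norm(w)

A_v = np.eye(4) - 2 * np.outer(v, v)      # sigma_v in the standard basis
A_w = np.eye(4) - 2 * np.outer(w, w)      # sigma_w in the standard basis

# b3, b4: an orthonormal basis of the orthogonal complement of span{v, w}.
Q, _ = np.linalg.qr(np.column_stack([v, w, np.eye(4)]))
b3, b4 = Q[:, 2], Q[:, 3]

P = np.column_stack([v, w, b3, b4])       # change-of-basis matrix for B = (v, w, b3, b4)
M_v = np.linalg.solve(P, A_v @ P)         # matrix of sigma_v with respect to B
M_w = np.linalg.solve(P, A_w @ P)         # matrix of sigma_w with respect to B

print(np.round(M_v[:, 0], 6))             # (-1, 0, 0, 0): sigma_v(v) = -v
print(np.round(M_w[:, 1], 6))             # (0, -1, 0, 0): sigma_w(w) = -w
print(np.round(M_v[:, 2:], 6))            # last two columns as in D: b3, b4 are fixed by sigma_v
```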
 
  • #26
Klaas van Aarsen said:
Let's assume for now that $v$ is independent from $w$.
And let $b_3$ and $b_4$ be vectors that are orthogonal to both $v$ and $w$.
Then we have that $\sigma_v(v)=-v$, so the first column of the matrix of ${\sigma_v}$ with respect to the basis $(v,w,b_3,b_4)$ is $\begin{pmatrix}-1\\0\\0\\0\end{pmatrix}$.
We also have that $\sigma_w(w)=-w$, so the second column of the matrix of ${\sigma_w}$ with respect to the basis $(v,w,b_3,b_4)$ is $\begin{pmatrix}0\\-1\\0\\0\end{pmatrix}$. 🤔

Ok, but shouldn't these vectors then be the first and second columns of $D$? How do we then get the cosine and the sine? :unsure:
 
  • #27
We can reduce the 4-dimensional problem to a 2-dimensional problem.
And we already know that the composition of two reflections in 2 dimensions is a rotation don't we? :unsure:
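
For reference, a quick 2D check of that fact (a sketch, not from the thread), writing each reflection as $I-2vv^\top$ for a unit normal $v$:

```python
import numpy as np

def reflection(theta):
    """2D reflection I - 2 v v^T whose unit normal is v = (cos theta, sin theta)."""
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.eye(2) - 2 * np.outer(v, v)

def rotation(alpha):
    """Standard 2D rotation matrix by the angle alpha."""
    return np.array([[np.cos(alpha), -np.sin(alpha)],
                     [np.sin(alpha),  np.cos(alpha)]])

t1, t2 = 1.1, 0.3                                 # directions of the two normals
R = reflection(t1) @ reflection(t2)               # composition of the two reflections
print(np.allclose(R, rotation(2 * (t1 - t2))))    # True: rotation by twice the angle between normals
```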
 
  • #28
Klaas van Aarsen said:
We can reduce the 4-dimensional problem to a 2-dimensional problem.
And we already know that the composition of two reflections in 2 dimensions is a rotation don't we? :unsure:

Ah so we consider the rotation matrix, right? :unsure:

So $\sigma_v\circ\sigma_w$ is the composition of two reflections, which is a rotation. Therefore the matrix $D$ must contain the rotation matrix for these vectors? :unsure:
 
  • #29
mathmari said:
Ah so we consider the rotation matrix, right?

So $\sigma_v\circ\sigma_w$ is the composition of two reflections, which is a rotation. Therefore the matrix $D$ must contain the rotation matrix for these vectors?
Yes. (Nod)

Generally, the composition of 2 reflections is a rotation, in the plane spanned by the two normals, by double the angle between the normals of the reflecting hyperplanes. 🧐
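
To make that concrete in the $n=4$ case (again only a sketch under the assumption that $v$ and $w$ are independent): computing $\sigma_v\circ\sigma_w$ in the standard basis as $(I-2vv^\top)(I-2ww^\top)$ and changing to an orthonormal basis whose first two vectors span $\operatorname{span}\{v,w\}$ produces a matrix of the form $D$, with $\cos\alpha=\cos(2\theta)$ where $\theta$ is the angle between $v$ and $w$:

```python
import numpy as np

rng = np.random.default_rng(2)
v = rng.standard_normal(4); v /= np.linalg.norm(v)
w = rng.standard_normal(4); w /= np.linalg.norm(w)

# sigma_v o sigma_w in the standard basis.
A = (np.eye(4) - 2 * np.outer(v, v)) @ (np.eye(4) - 2 * np.outer(w, w))

# Orthonormal basis: first two vectors span span{v, w}, last two span its orthogonal complement.
Q, _ = np.linalg.qr(np.column_stack([v, w, np.eye(4)]))
D = Q.T @ A @ Q                                   # matrix of sigma_v o sigma_w w.r.t. this basis

theta = np.arccos(np.clip(v @ w, -1.0, 1.0))      # angle between the unit normals v and w
print(np.round(D, 6))                             # 2x2 rotation block, then the 2x2 identity block
print(np.isclose(D[0, 0], np.cos(2 * theta)))     # True: the rotation angle alpha satisfies cos(alpha) = cos(2*theta)
print(np.allclose(D[2:, 2:], np.eye(2)),
      np.allclose(D[:2, 2:], 0),
      np.allclose(D[2:, :2], 0))                  # True True True: block structure as in D
```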
 
  • #30
Klaas van Aarsen said:
Yes. (Nod)

Generally, the composition of 2 reflections is a rotation, in the plane spanned by the two normals, by double the angle between the normals of the reflecting hyperplanes. 🧐

So do we have to consider that $v$ and $w$ are independent, or do we also have to check the case that they are not independent? :unsure:
 
