Diagonalizable transformation - Existence of basis

Discussion Overview

The discussion revolves around the diagonalizability of the reflection $\sigma_v(x) = x - 2(x\cdot v)v$ on $\mathbb{R}^n$, focusing on the existence of an orthogonal basis and the eigenvalues associated with the transformation. Participants explore the transformation's properties, its eigenvectors, and the construction of a suitable basis in the context of linear algebra.

Discussion Character

  • Exploratory
  • Technical explanation
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • Some participants propose that a transformation is diagonalizable if it has $n$ independent eigenvectors.
  • It is suggested that the vector $v$ is an eigenvector for the eigenvalue $\lambda = -1$, while vectors perpendicular to $v$ are eigenvectors for the eigenvalue $\lambda = 1$.
  • Participants discuss the number of independent vectors that can be found perpendicular to $v$, suggesting there are $n-1$ such vectors in $\mathbb{R}^n$.
  • There is a proposal to determine the eigenvalues of the transformation by analyzing the equation $\sigma_v(x) = \lambda x$, leading to the conclusion that the eigenvalues are $\lambda = \pm 1$ (see the numeric sketch after this list).
  • Some participants question how to ascertain the algebraic multiplicity of the eigenvalue $\lambda = 1$, with references to the relationship between geometric and algebraic multiplicities.
  • There is a suggestion to find a basis of $\mathbb{R}^4$ that includes vectors orthogonal to both $v$ and $w$, and a discussion on how to construct such a basis using the Gram-Schmidt process.
  • Participants express uncertainty about how to derive the form of the matrix $D$ representing the transformation in the chosen basis.
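
The eigenvalue claims above are straightforward to check numerically. Here is a minimal sketch (not from the original discussion), assuming the transformation is the reflection $\sigma_v(x) = x - 2(x\cdot v)v$ with $v$ a unit vector, as used later in the thread:

```python
import numpy as np

# Sketch: verify that the reflection sigma_v(x) = x - 2(x.v)v has
# eigenvalue -1 once (eigenvector v) and +1 with multiplicity n - 1.
n = 4
rng = np.random.default_rng(0)
v = rng.standard_normal(n)
v /= np.linalg.norm(n * [1.0 / n**0.5]) if False else np.linalg.norm(v)  # sigma_v requires a unit vector

S = np.eye(n) - 2.0 * np.outer(v, v)      # matrix of sigma_v in the standard basis

print(np.sort(np.linalg.eigvalsh(S)))     # [-1.  1.  1.  1.]
assert np.allclose(S @ v, -v)             # v is an eigenvector for -1
```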

Areas of Agreement / Disagreement

Participants generally agree on the properties of eigenvectors and the implications of diagonalizability, but there are unresolved questions regarding the construction of the basis and the explicit form of the matrix representing the transformation.

Contextual Notes

Limitations include the need for further clarification on the construction of the orthogonal basis and the specific calculations required to express the transformation in matrix form.

Who May Find This Useful

Readers interested in linear algebra, particularly those studying transformations, eigenvalues, and diagonalizability in the context of vector spaces.

  • #31
mathmari said:
So do we have to consider that $v$ and $w$ are independent, or do we also have to check the case that they are not independent?
That depends on how we set up the proof.
Perhaps we can start with the assumption that $v$ and $w$ are independent.
When the proof is complete, perhaps we won't have to make the distinction any more. 🤔
 
  • #32
Klaas van Aarsen said:
Let's assume for now that $v$ is independent from $w$.
And let $b_3$ and $b_4$ be vectors that are orthogonal to both $v$ and $w$.
Then we have that $\sigma_v(v)=-v$, so the first column of the matrix of ${\sigma_v}$ with respect to the basis $(v,w,b_3,b_4)$ is $\begin{pmatrix}-1\\0\\0\\0\end{pmatrix}$.
We also have that $\sigma_w(w)=-w$, so the second column of the matrix of ${\sigma_w}$ with respect to the basis $(v,w,b_3,b_4)$ is $\begin{pmatrix}0\\-1\\0\\0\end{pmatrix}$. 🤔

But here we don't have the rotation matrix, do we? :unsure:
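
As an aside, the columns described in the quoted post can be checked numerically: the $j$-th column of the matrix of a map with respect to a basis is the coordinate vector of the image of the $j$-th basis vector. A minimal sketch (not from the thread; the concrete random $v$ and $w$ are assumptions for illustration):

```python
import numpy as np

# Sketch: matrix of sigma_v with respect to the basis (v, w, b3, b4).
rng = np.random.default_rng(1)
v = rng.standard_normal(4); v /= np.linalg.norm(v)
w = rng.standard_normal(4); w /= np.linalg.norm(w)

# b3, b4: orthonormal vectors spanning the orthogonal complement of span(v, w),
# read off from the full QR decomposition of the 4x2 matrix [v w].
Q, _ = np.linalg.qr(np.column_stack([v, w]), mode="complete")
B = np.column_stack([v, w, Q[:, 2], Q[:, 3]])   # basis (v, w, b3, b4) as columns

S_v = np.eye(4) - 2.0 * np.outer(v, v)          # sigma_v in the standard basis
M = np.linalg.solve(B, S_v @ B)                 # matrix of sigma_v w.r.t. B

print(np.round(M[:, 0], 6))                     # first column: (-1, 0, 0, 0)
```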
 
  • #33
mathmari said:
But here we don't have the rotation matrix, do we?
Not yet. 🤔
 
  • #34
Klaas van Aarsen said:
Not yet. 🤔

I'm stuck right now. What do we do next? How do we get the rotation matrix? :unsure:
 
  • #35
mathmari said:
I'm stuck right now. What do we do next? How do we get the rotation matrix?
What is the matrix of $\sigma_v$ with respect to the basis $(v,w,b_3,b_4)$?
What is the matrix of $\sigma_w$ with respect to the basis $(v,w,b_3,b_4)$?
What is the product of those matrices?
Can we find a more convenient basis so that we get $D$? 🤔
 
  • #36
Klaas van Aarsen said:
What is the matrix of $\sigma_v$ with respect to the basis $(v,w,b_3,b_4)$?
What is the matrix of $\sigma_w$ with respect to the basis $(v,w,b_3,b_4)$?
What is the product of those matrices?
Can we find a more convenient basis so that we get $D$? 🤔

It holds that $\sigma_v(v)=-v$ and $\sigma_v(w)=\sigma_v(b_3)=\sigma_v(b_4)=0$, or not?
So we get the matrix $\begin{pmatrix}-1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0\end{pmatrix}$.
Respectively, $\sigma_w(w)=-w$ and $\sigma_w(v)=\sigma_w(b_3)=\sigma_w(b_4)=0$, right?
So we get the matrix $\begin{pmatrix}0 & 0 & 0 & 0 \\ 0 & -1 & 0 & 0 \\ 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0\end{pmatrix}$.
The product of those matrices is the zero matrix, or not?
:unsure:
 
  • #37
mathmari said:
It holds that $\sigma_v(v)=-v$ and $\sigma_v(w)=\sigma_v(b_3)=\sigma_v(b_4)=0$, or not?
Nope. (Shake)
Suppose we fill in $b_3$ in the formula of $\sigma_v$ and use that $b_3\cdot v=0$, what do we get? 🤔
 
  • #38
Klaas van Aarsen said:
Nope. (Shake)
Suppose we fill in $b_3$ in the formula of $\sigma_v$ and use that $b_3\cdot v=0$, what do we get? 🤔

It holds that $\sigma_v(v)=-v$ and $\sigma_v(w)=w, \sigma_v(b_3)=b_3, \sigma_v(b_4)=b_4$, or not?
So we get the matrix $\begin{pmatrix}-1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 1\end{pmatrix}$.
Respectively, $\sigma_w(w)=-w$ and $\sigma_w(v)=v, \sigma_w(b_3)=b_3, \sigma_w(b_4)=b_4$, right?
So we get the matrix $\begin{pmatrix}1 & 0 & 0 & 0 \\ 0 & -1 & 0 & 0 \\ 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 1\end{pmatrix}$.
The product of those matrices is $\begin{pmatrix}-1 & 0 & 0 & 0 \\ 0 & -1 & 0 & 0 \\ 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 1\end{pmatrix}$, or not?
:unsure:
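
These matrices can be checked numerically in the special case where $w$ is orthogonal to $v$. A minimal sketch (not from the thread), taking $v = e_1$ and $w = e_2$ for illustration, so that $(v, w, b_3, b_4)$ is simply the standard basis:

```python
import numpy as np

# Sketch: for orthogonal unit vectors v, w the product of the two
# reflections is diag(-1, -1, 1, 1), matching the matrix above.
v = np.array([1.0, 0.0, 0.0, 0.0])
w = np.array([0.0, 1.0, 0.0, 0.0])        # w.v = 0 by construction

S_v = np.eye(4) - 2.0 * np.outer(v, v)    # diag(-1, 1, 1, 1)
S_w = np.eye(4) - 2.0 * np.outer(w, w)    # diag(1, -1, 1, 1)

print(S_v @ S_w)                          # diag(-1, -1, 1, 1)
```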
 
  • #39
mathmari said:
It holds that $\sigma_v(v)=-v$ and $\sigma_v(w)=w, \sigma_v(b_3)=b_3, \sigma_v(b_4)=b_4$, or not?
Better. :cool:
But we do not generally have $\sigma_v(w)=w$. That is only the case if $w$ happens to be orthogonal to $v$.
If that is the case, then we did find $D$ for this special case, in which we have $\alpha=\pi$. 🤔
 
  • #40
Klaas van Aarsen said:
Better. :cool:
But we do not generally have $\sigma_v(w)=w$. That is only the case if $w$ happens to be orthogonal to $v$. 🤔

Yes, in this case we suppose that $w, b_3, b_4$ are orthogonal to $v$, right? :unsure:

So is this the resulting matrix, and do we then find an $\alpha$ that gives it? :unsure:
 
  • #41
Klaas van Aarsen said:
If that is the case, then we did find $D$ for this special case, in which we have $\alpha=\pi$. 🤔

So do we have to do the same in the case that $v$ and $w$ are not orthogonal? :unsure:
 
  • #42
We can always pick $b_3$ and $b_4$, such that they are orthogonal to both $v$ and $w$.
However, we do not get to pick $w$. The vectors $v$ and $w$ are given as part of the problem. We do not know anything about their relationship.
We can only distinguish the cases that they are either independent or not. (Sweating)
And of course we can take a look at the special case that $v$ and $w$ are orthogonal and see that it works out.
 
  • #43
Klaas van Aarsen said:
We can always pick $b_3$ and $b_4$, such that they are orthogonal to both $v$ and $w$.
However, we do not get to pick $w$. The vectors $v$ and $w$ are given as part of the problem. We do not know anything about their relationship.
We can only distinguish the cases that they are either independent or not. (Sweating)

Suppose $v$ and $w$ are not independent.
It holds that $\sigma_v(v)=-v$ and $\sigma_v(w)=w-(w\cdot v)w=(1-w\cdot v)w$, $\sigma_v(b_3)=b_3, \sigma_v(b_4)=b_4$, or not?
So we get the matrix $\begin{pmatrix}-1 & 0 & 0 & 0 \\ 0 & (1-w\cdot v) & 0 & 0 \\ 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 1\end{pmatrix}$.
Respectively, $\sigma_w(w)=-w$ and $\sigma_w(v)=v-(v\cdot w)v=(1-v\cdot w)v, \sigma_w(b_3)=b_3, \sigma_w(b_4)=b_4$, right?
So we get the matrix $\begin{pmatrix}(1-v\cdot w) & 0 & 0 & 0 \\ 0 & -1 & 0 & 0 \\ 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 1\end{pmatrix}$.

Is that correct? :unsure:
 
  • #44
mathmari said:
Suppose $v$ and $w$ are not independent.
Erm... if $v$ and $w$ are not independent, then they are dependent, and $(v,w,b_3,b_4)$ is not a basis.
Then we cannot write the matrix of $\sigma_v$ with respect to $(v,w,b_3,b_4)$. (Tauri)
 
  • #45
And we have $\sigma_v(w)=w-2(w\cdot v)v$, don't we? :unsure:
 
  • #46
Klaas van Aarsen said:
Erm... if $v$ and $w$ are not independent, then they are dependent, and $(v,w,b_3,b_4)$ is not a basis.
Then we cannot write the matrix of $\sigma_v$ with respect to $(v,w,b_3,b_4)$. (Tauri)

So $v$ and $w$ must be independent? Or do we do something else in that case? :unsure:
 
  • #47
mathmari said:
So $v$ and $w$ must be independent? Or do we do something else in that case?
Not necessarily. It's just a different case. 🤔
If $v$ and $w$ are dependent, we can pick a basis $(v,b_2,b_3,b_4)$ with each $b_i$ perpendicular to both $v$ and $w$.
It's another special case that corresponds to $\alpha=0$.
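
A quick numeric check of this dependent case (a sketch, not from the thread; taking $w = -v$ as an illustrative assumption): since $\sigma_w = \sigma_v$ and a reflection is an involution, the product is the identity, i.e. a rotation by $\alpha = 0$.

```python
import numpy as np

# Sketch: if w is a multiple of v (here w = -v), the two reflections
# coincide and their product is the identity: the alpha = 0 case.
rng = np.random.default_rng(2)
v = rng.standard_normal(4); v /= np.linalg.norm(v)
w = -v                                     # dependent case

S_v = np.eye(4) - 2.0 * np.outer(v, v)
S_w = np.eye(4) - 2.0 * np.outer(w, w)
assert np.allclose(S_v @ S_w, np.eye(4))   # rotation by alpha = 0
```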
 
  • #48
Klaas van Aarsen said:
Not necessarily. It's just a different case. 🤔
If $v$ and $w$ are dependent, we can pick a basis $(v,b_2,b_3,b_4)$ with each $b_i$ perpendicular to both $v$ and $w$.
It's another special case that corresponds to $\alpha=0$.

So in general we have these two cases, right: one with $\alpha=\pi$ and one with $\alpha=0$? :unsure:
 
  • #49
mathmari said:
So in general we have these two cases, right: one with $\alpha=\pi$ and one with $\alpha=0$?
The more 'general' case is when $0<\alpha<\pi$. Of course we also need to ensure that the edge cases are covered.
 
  • #50
Klaas van Aarsen said:
The more 'general' case is when $0<\alpha<\pi$. Of course we also need to ensure that the edge cases are covered.

But how do we get that more general case, so that the matrix depends on $\alpha$? :unsure:
 
  • #51
Let's see what $\sigma_v$ looks like with respect to the basis $(v,w,b_3,b_4)$ assuming that $v$ and $w$ are independent.

We have $\sigma_v(w)=w-2(w\cdot v)v$.
So:
$$D_{\sigma_v} = \begin{pmatrix}-1&-2w\cdot v\\&1\\&&1\\&&&1\end{pmatrix}$$

Similarly we have $\sigma_w(v)=v-2(v\cdot w)w$. So:
$$D_{\sigma_w} = \begin{pmatrix}1\\-2v\cdot w&-1\\&&1\\&&&1\end{pmatrix}$$
And:
$$D_{\sigma_v} D_{\sigma_w}= \begin{pmatrix}-1+4(v\cdot w)^2&2v\cdot w\\-2v\cdot w&-1\\&&1\\&&&1\end{pmatrix}$$

Close, isn't it? But not quite there. 🤔

We do see that the product has a top-left $2\times 2$ block and is otherwise the identity matrix.
And we already know that two reflections are supposed to compose to a rotation in 2 dimensions.
But that is with respect to an orthonormal basis.
So suppose we construct an orthonormal basis.
We can start with $v$.
We can pick a second vector $b_2$ perpendicular to $v$ with unit length such that $w$ is a linear combination of $v$ and $b_2$.
If $v$ and $w$ are independent, then we can construct it with $\tilde b_2 =w-(w\cdot v)v$ and then pick $b_2=\tilde b_2/\|\tilde b_2\|$.
What can we do if $v$ and $w$ are dependent? 🤔
Finally we pick $b_3$ and $b_4$ so that $(v,b_2,b_3,b_4)$ is an orthonormal basis.

The resulting matrix $D_{\sigma_v} D_{\sigma_w}$ will again have a top-left $2\times 2$ block and will otherwise be the identity matrix, won't it? 🤔
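
Both observations can be checked numerically. A minimal sketch (not from the thread; the random unit vectors are assumptions for illustration) that reproduces the product matrix above in the basis $(v, w, b_3, b_4)$ and then repeats the computation in the orthonormal basis $(v, b_2, b_3, b_4)$:

```python
import numpy as np

# Sketch: product of the two reflections in two different bases.
rng = np.random.default_rng(3)
v = rng.standard_normal(4); v /= np.linalg.norm(v)
w = rng.standard_normal(4); w /= np.linalg.norm(w)
c = v @ w

S_v = np.eye(4) - 2.0 * np.outer(v, v)
S_w = np.eye(4) - 2.0 * np.outer(w, w)

# b3, b4: orthonormal and perpendicular to both v and w.
Q, _ = np.linalg.qr(np.column_stack([v, w]), mode="complete")
b3, b4 = Q[:, 2], Q[:, 3]

# Basis (v, w, b3, b4): top-left block is [[-1 + 4c^2, 2c], [-2c, -1]].
B = np.column_stack([v, w, b3, b4])
M = np.linalg.solve(B, S_v @ S_w @ B)
print(np.round(M[:2, :2], 6))

# Orthonormal basis (v, b2, b3, b4) with b2 = (w - (w.v)v) / ||w - (w.v)v||:
# now the top-left block is a genuine rotation and the rest is the identity.
b2 = w - c * v; b2 /= np.linalg.norm(b2)
B2 = np.column_stack([v, b2, b3, b4])
D = B2.T @ S_v @ S_w @ B2                 # B2 is orthonormal, so B2^{-1} = B2^T
print(np.round(D, 6))
```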
 
  • #52
So in general, is $\alpha$ the angle between $v$ and $w$? :unsure:
 
  • #53
mathmari said:
So in general, is $\alpha$ the angle between $v$ and $w$?
$\alpha$ is twice the angle between $v$ and $w$. 🤔
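
A minimal numeric sketch (not from the thread; random unit vectors assumed for illustration) confirming this: in the orthonormal basis $(v, b_2, b_3, b_4)$ of post #51, the top-left block rotates by twice the angle $\theta$ between $v$ and $w$.

```python
import numpy as np

# Sketch: check that cos(alpha) = cos(2*theta), where theta is the
# angle between the unit vectors v and w.
rng = np.random.default_rng(4)
v = rng.standard_normal(4); v /= np.linalg.norm(v)
w = rng.standard_normal(4); w /= np.linalg.norm(w)
theta = np.arccos(np.clip(v @ w, -1.0, 1.0))

S = (np.eye(4) - 2.0 * np.outer(v, v)) @ (np.eye(4) - 2.0 * np.outer(w, w))

b2 = w - (v @ w) * v; b2 /= np.linalg.norm(b2)
Q, _ = np.linalg.qr(np.column_stack([v, b2]), mode="complete")
B = np.column_stack([v, b2, Q[:, 2], Q[:, 3]])
D = B.T @ S @ B                           # B is orthonormal, so B^{-1} = B^T

print(np.isclose(D[0, 0], np.cos(2 * theta)))   # True: alpha = 2*theta
```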
 
