Eigenvectors and orthogonal basis

Summary: The discussion revolves around demonstrating that a specific linear transformation on ##\mathbb{R}^3## does not have an orthogonal basis of eigenvectors, despite having identified candidate eigenvectors. The user notes that the eigenvectors are not mutually orthogonal, as evidenced by their dot products. They consider using Gram-Schmidt but recognize it would alter the eigenvectors. Additionally, the implications of having orthogonal eigenvectors are highlighted, including benefits for geometric interpretation, numerical stability, and ease of manipulation of quadratic forms. Ultimately, the challenge lies in proving the absence of an orthogonal eigenvector basis for the transformation.
0kelvin

Homework Statement


I have a linear transformation ##\mathbb{R}^3 \rightarrow \mathbb{R}^3##. I've already solved the part that asks for a basis of eigenvectors: the eigenvectors are ##(1,-3,0)##, ##(1,0,3)##, and ##(\frac{1}{2}, \frac{1}{2}, 1)##. Now the exercise wants me to show that there is no orthogonal basis of eigenvectors for this particular linear transformation.

How do I show it?

The exercise doesn't ask this, but what's the implication of eigenvectors forming an orthogonal basis?

Homework Equations

The Attempt at a Solution


With the Euclidean inner product I can clearly see that the eigenvectors are not orthogonal to each other, but I'm not sure whether computing all the pairwise dot products is the right way to show it.
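For concreteness, here is a minimal NumPy sketch of that pairwise check, using the three eigenvectors quoted above (the check itself is standard; nothing beyond the vectors is taken from the thread):

```python
import numpy as np
from itertools import combinations

# The three eigenvectors quoted above.
v1 = np.array([1.0, -3.0, 0.0])
v2 = np.array([1.0, 0.0, 3.0])
v3 = np.array([0.5, 0.5, 1.0])

# Orthogonality fails as soon as any pairwise dot product is nonzero.
for a, b in combinations([v1, v2, v3], 2):
    print(np.dot(a, b))  # prints 1.0, -1.0, 3.5 -- all nonzero
```

Strictly speaking, this rules out an orthogonal eigenbasis only if the eigenvalues are distinct, so that each eigenspace is one-dimensional and these directions are, up to scaling, the only candidates; the eigenvalues aren't stated in the thread.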

I thought about Gram-Schmidt but doing that would make the vectors not be eigenvectors anymore.
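For illustration, a minimal classical Gram-Schmidt sketch applied to these vectors. The outputs are orthonormal, but only the first is still parallel to one of the eigenvectors, which is exactly the problem the poster anticipates:

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: subtract projections, then normalize."""
    basis = []
    for v in vectors:
        w = v - sum((np.dot(v, u) * u for u in basis), np.zeros_like(v))
        basis.append(w / np.linalg.norm(w))
    return basis

vs = [np.array([1.0, -3.0, 0.0]),
      np.array([1.0, 0.0, 3.0]),
      np.array([0.5, 0.5, 1.0])]
for u in gram_schmidt(vs):
    print(u)
# Only the first output is parallel to an eigenvector; the others are
# mixtures of eigendirections, so (for distinct eigenvalues) they are
# no longer eigenvectors.
```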
 
If you have three eigenvectors, then either they are mutually orthogonal or they are not.
 
0kelvin said:
Now the exercise wants me to show that there is no orthogonal basis of eigenvectors for this particular linear transformation.

How do I show it?

I suppose you could do a Schur decomposition, and show that you get ##\mathbf{A = UTU^*}## where ##\mathbf{T}## has the eigenvalues along its diagonal but is only upper triangular (i.e. it is not both upper triangular and lower triangular -- i.e. not diagonal). Out of curiosity, what are the eigenvalues here?
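Here is a sketch of that check. The transformation matrix itself isn't given in the thread, so ##\mathbf{A}## below is reconstructed from the quoted eigenvectors with hypothetical eigenvalues 1, 2, 3 purely for illustration; scipy.linalg.schur then exhibits a ##\mathbf{T}## that is upper triangular but not diagonal:

```python
import numpy as np
from scipy.linalg import schur

# Columns are the eigenvectors quoted in the thread; the eigenvalues
# 1, 2, 3 are hypothetical stand-ins (they aren't stated in the thread).
P = np.array([[ 1.0, 1.0, 0.5],
              [-3.0, 0.0, 0.5],
              [ 0.0, 3.0, 1.0]])
A = P @ np.diag([1.0, 2.0, 3.0]) @ np.linalg.inv(P)

T, U = schur(A)  # A = U T U^T with U orthogonal, T upper triangular
print(np.round(T, 3))
# The eigenvalues appear on the diagonal of T, but T has nonzero entries
# above the diagonal, so A is not normal: no orthonormal eigenbasis exists.
```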

More importantly:

0kelvin said:
The exercise doesn't ask this, but what's the implication of eigenvectors forming an orthogonal basis?

Orthogonal (and unitary, if complex) matrices are extremely pleasant to work with. From a geometric standpoint, they are length preserving. Consider some real-valued vector ##\mathbf{x}## and an orthogonal matrix ##\mathbf{U}##: ##||\mathbf{Ux}||_2^2 = \mathbf{x^TU^TUx} = \mathbf{x^TIx} = \mathbf{x^Tx} = ||\mathbf{x}||_2^2##. Having mutually orthonormal eigenvectors is immensely useful in manipulating things like quadratic forms -- e.g. maximizing or minimizing ##\mathbf{x^TAx}##, which comes up all the time (e.g. the Hessian matrix of second derivatives in calculus).
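A quick numerical spot check of that identity, with an orthogonal ##\mathbf{U}## obtained from a QR factorization (the particular matrix is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
U, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # U is orthogonal
x = rng.standard_normal(3)

# ||Ux||_2 should equal ||x||_2 up to rounding error.
print(np.linalg.norm(U @ x), np.linalg.norm(x))
```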

On top of all this, orthogonal matrices are very nice for preserving numerical stability if you're doing serious computational work.
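To make the stability point concrete: the 2-norm condition number of an orthogonal matrix is exactly 1, so multiplying by one never amplifies relative error. A one-line check:

```python
import numpy as np

U, _ = np.linalg.qr(np.random.default_rng(1).standard_normal((5, 5)))
print(np.linalg.cond(U))  # 1.0 up to rounding, the best possible
```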
 